Compare commits

..

7 commits

Author SHA1 Message Date
0318793cae [alembic] just one revision
Some checks failed
Python Quality & Tests / Lint, Typecheck, and Test (push) Failing after 31s
2026-04-16 15:19:47 +02:00
8fe597ce70 [workflow] bump 2026-04-15 17:24:27 +02:00
c9bef35303 [preset create] view for creating empty preset
Some checks failed
Python lint and test / linttest (push) Failing after 8s
2026-04-08 19:31:48 +02:00
2a5b40ea7b [cleanup] clean .dockerignore and pre commit hooks 2026-04-08 13:33:44 +02:00
3e31c26f5e tests bump
Some checks are pending
Python lint and test / linttest (push) Waiting to run
2026-04-08 13:29:25 +02:00
f2ab3e0498 readme 2026-04-08 12:02:35 +02:00
9ee27d70e2 [env.template] bump 2026-04-08 11:50:51 +02:00
19 changed files with 399 additions and 504 deletions

View file

@@ -1 +1,11 @@
-!negotiator/
+ARCHITECTURE.md
+doc
+docker-compose.yml
+Dockerfile
+Makefile
+mypy.ini
+pyrightconfig.json
+pytest.ini
+Readme.md
+setup.cfg
+test.sh

View file

@@ -1,36 +1,44 @@
-name: Python lint and test
+name: Python Quality & Tests
 on:
   push:
-    branches:
-      - main
-      - 'releases/**'
-    paths:
-      - '**.py'
+    branches: [main, 'releases/**']
+    paths: ['**.py']
   pull_request:
-    branches:
-      - main
-      - 'releases/**'
-    paths:
-      - '**.py'
+    branches: [main, 'releases/**']
+    paths: ['**.py']
+
+# Automatically cancel older runs of the same PR/branch if a new commit is pushed
+concurrency:
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: true
 
 jobs:
-  linttest:
-    runs-on: ubuntu-22.04
+  quality:
+    name: Lint, Typecheck, and Test
+    runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4
-      - uses: actions/setup-python@v5
-        with:
-          python-version: '3.12'
-      - name: Install dependencies
-        run: |
-          python -m pip install --upgrade pip
-          pip install -r requirements_local.txt
-      - name: Run black
-        run: black --check fooder
-      - name: Run flake8
-        run: flake8 fooder
-      - name: Run mypy
-        run: mypy fooder
-      - name: Run tests
-        run: ./test.sh
+      - name: Checkout Code
+        uses: actions/checkout@v4
+
+      - name: Install uv (Fastest Python Tooling)
+        uses: astral-sh/setup-uv@v3
+        with:
+          enable-cache: true
+
+      - name: Set up Python
+        run: uv python install 3.14
+
+      - name: Install Dependencies
+        run: uv pip install -r requirements/local.txt
+
+      - name: Lint and Format (Ruff)
+        # Replaces Black and Flake8 in one go
+        run: uv run ruff check fooder && uv run ruff format --check fooder
+
+      - name: Type Check (Mypy)
+        run: uv run mypy fooder
+
+      - name: Run Tests
+        run: chmod +x ./test.sh && ./test.sh

View file

@@ -1,5 +0,0 @@
repos:
- repo: https://github.com/ambv/black
rev: stable
hooks:
- id: black

View file

@@ -30,6 +30,9 @@ lint: black mypy flake
 version:
 	@echo $(VERSION)
 
+openapi:
+	@curl http://localhost:8000/openapi.json -o doc/openapi.json
+
 create-venv:
 	python3 -m venv .venv --prompt="fooderapi-venv" --system-site-packages
 	bash -c "source .venv/bin/activate && pip install -r requirements/local.txt"

View file

@ -2,51 +2,31 @@
Simple API for food diary application. It uses FastAPI and async postgres. Simple API for food diary application. It uses FastAPI and async postgres.
## Usage ## Overview and key points of configurating and deploying API
Simply build docker images with: API is maily suited for being run as docker container, either via docker compose or inside kubernetes/openshift, tho no configuration for those are provided in this repository. Example `docker-compose.yml` for deployment is provided in `doc/install/docker-compose.yml`.
```bash Configuration is passed through env variables. Sample configuration is provided in `doc/install/env.template`.
docker-compose build
```
And then start them up with: Database migrations are handled through alembic. For ease of use, `doc/install/Makefile` provides formula for `make migrate` that upgrades database to head migration.
```bash ## Development
docker-compose up -d
```
Create `.env` file with configuration by copying template `env.template`. *You have to generate secret keys!* They are stored under `SECRET_KEY` and `REFRESH_SECRET_KEY` Project architecture design was summarized in `ARCHITECTURE.md`, altho summary was generated with usage of AI for now so might not be perfect or informative ;)
in `.env` file and both could be generated with `openssl rand -hex 32`.
To initialize database exec: Overall, project consists of:
```bash - `fooder/alembic` - alembic python scripts, migrations, autogenerated
docker-compose exec api bash -c python -m fooder --create-tables - `fooder/context` - context with authorized session, repositories
``` - `fooder/domain` - sqlalchemy domain definitions
- `fooder/controller` - controllers for domain models, controllers hold logic linked with only one model
## Deployment - `fooder/repository` - repositories for domain models, again single repo per model, altho because of joins there are some methods that touch more than one model
- `fooder/model`- pydantic models
For deployment delete: - `fooder/view` - fastapi routes
- `fooder/utils` - utils
``` - `fooder/app` - fastapi app definitions
volumes: - `fooder/db` - sqlalchemy database core functionality
- ./fooder:/opt/fooder/fooder - `fooder/recaptcha` - recaptcha implementation
``` - `fooder/exc` - exceptions live there
- `fooder/router` - fastapi main router
and - `fooder/settings` - pydantic-settings for app
```
command: "uvicorn fooder.app:app --host 0.0.0.0 --port 8000 --reload"
```
lines from `docker-compose.yml`. This line makes app reload everytime code changes and is only recommended for development purposes.
I highly recommend using reverse proxy like nginx for HTTPS.
## Documentation
When api is started using docker, by default it runs on 8000 port on local machine (it can be changed within `docker-compose.yml` file). Swagger documentation is available
on `http://0.0.0.0:8000/docs` when you start the app.
Alternatively you can visit [my own instance of the API docs](https://fooderapi.domandoman.xyz/docs).
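The layered design the README's module list describes (view → controller → repository → domain) can be sketched as a toy example. All classes below are illustrative stand-ins, not fooder's actual interfaces:

```python
# Toy sketch of the layering: the view would call a controller, which holds
# logic for one model and talks to a repository for data access.
class ProductRepository:
    """Stand-in for fooder/repository: owns data access for one model."""

    def __init__(self) -> None:
        self._rows = {1: {"id": 1, "name": "Oats"}}

    def get(self, product_id: int) -> dict:
        return self._rows[product_id]


class ProductController:
    """Stand-in for fooder/controller: logic tied to a single model."""

    def __init__(self, repo: ProductRepository) -> None:
        self.repo = repo

    def rename(self, product_id: int, name: str) -> dict:
        product = self.repo.get(product_id)
        product["name"] = name
        return product


controller = ProductController(ProductRepository())
renamed = controller.rename(1, "Rolled oats")
```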

View file

@@ -0,0 +1,26 @@
networks:
fooder:
driver: bridge
services:
database:
restart: unless-stopped
image: postgres
networks:
- fooder
env_file:
- .env
api:
restart: unless-stopped
image: registry.domandoman.xyz/fooder/api
build:
dockerfile: Dockerfile
context: .
platform: linux/amd64
networks:
- fooder
env_file:
- .env
ports:
- "8000:8000"

View file

@@ -11,3 +11,5 @@ REFRESH_SECRET_KEY="${REFRESH_SECRET_KEY}" # generate with $ openssl rand -hex 3
 ALGORITHM="HS256"
 ACCESS_TOKEN_EXPIRE_MINUTES=30
 REFRESH_TOKEN_EXPIRE_DAYS=30
+
+TURNSTILE_SECRET_KEY=1x0000000000000000000000000000000AA
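The added `TURNSTILE_SECRET_KEY` value appears to be Cloudflare Turnstile's dummy "always passes" test secret, so the template stays safe to commit. A minimal stdlib sketch of how such env configuration could be consumed (the real app uses pydantic-settings in `fooder/settings`; the loader and field names here are hypothetical):

```python
# Hypothetical loader mirroring the template's variable names; shown only to
# illustrate env-driven configuration, not fooder's actual settings class.
def load_auth_settings(env: dict[str, str]) -> dict:
    return {
        "secret_key": env["SECRET_KEY"],  # required, no default
        "algorithm": env.get("ALGORITHM", "HS256"),
        "access_token_expire_minutes": int(env.get("ACCESS_TOKEN_EXPIRE_MINUTES", "30")),
        "refresh_token_expire_days": int(env.get("REFRESH_TOKEN_EXPIRE_DAYS", "30")),
        "turnstile_secret_key": env.get("TURNSTILE_SECRET_KEY"),
    }


settings = load_auth_settings({
    "SECRET_KEY": "dummy-for-illustration",
    "ACCESS_TOKEN_EXPIRE_MINUTES": "30",
    "TURNSTILE_SECRET_KEY": "1x0000000000000000000000000000000AA",
})
```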

File diff suppressed because one or more lines are too long

View file

@@ -1,266 +0,0 @@
"""
Revision ID: 4e8d78ff6e9e
Revises: 97d77db27867
Create Date: 2026-04-07 14:41:20.398451
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "4e8d78ff6e9e"
down_revision: Union[str, Sequence[str], None] = "97d77db27867"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Upgrade schema."""
_now = sa.text("CURRENT_TIMESTAMP")
_zero_int = sa.text("0")
_zero_float = sa.text("0")
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
"userproductusage",
sa.Column("product_id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("count", sa.Integer(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["product_id"],
["product.id"],
),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id", "product_id"),
)
op.create_table(
"usersettings",
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("protein_goal", sa.Float(), nullable=False),
sa.Column("carb_goal", sa.Float(), nullable=False),
sa.Column("fat_goal", sa.Float(), nullable=False),
sa.Column("fiber_goal", sa.Float(), nullable=False),
sa.Column("calories_goal", sa.Float(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id"),
)
# Create default settings for all existing users
op.execute(
sa.text(
"INSERT INTO usersettings"
" (user_id, protein_goal, carb_goal, fat_goal, fiber_goal, calories_goal,"
" version, created_at, last_changed)"
" SELECT id, 0, 0, 0, 0, 0, 0, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP"
' FROM "user"'
)
)
op.drop_table("refreshtoken")
op.add_column(
"diary",
sa.Column(
"protein_goal", sa.Float(), nullable=False, server_default=_zero_float
),
)
op.add_column(
"diary",
sa.Column("carb_goal", sa.Float(), nullable=False, server_default=_zero_float),
)
op.add_column(
"diary",
sa.Column("fat_goal", sa.Float(), nullable=False, server_default=_zero_float),
)
op.add_column(
"diary",
sa.Column("fiber_goal", sa.Float(), nullable=False, server_default=_zero_float),
)
op.add_column(
"diary",
sa.Column(
"calories_goal", sa.Float(), nullable=False, server_default=_zero_float
),
)
op.add_column(
"diary",
sa.Column("version", sa.Integer(), nullable=False, server_default=_zero_int),
)
op.add_column(
"diary",
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"diary",
sa.Column("last_changed", sa.DateTime(), nullable=False, server_default=_now),
)
op.create_unique_constraint(None, "diary", ["user_id", "date"])
op.add_column(
"entry",
sa.Column("version", sa.Integer(), nullable=False, server_default=_zero_int),
)
op.add_column(
"entry",
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"meal",
sa.Column("version", sa.Integer(), nullable=False, server_default=_zero_int),
)
op.add_column(
"meal",
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"meal",
sa.Column("last_changed", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"preset",
sa.Column("version", sa.Integer(), nullable=False, server_default=_zero_int),
)
op.add_column(
"preset",
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"preset",
sa.Column("last_changed", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"presetentry",
sa.Column("version", sa.Integer(), nullable=False, server_default=_zero_int),
)
op.add_column(
"presetentry",
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"product",
sa.Column("calories", sa.Float(), nullable=False, server_default=_zero_float),
)
op.add_column(
"product",
sa.Column("version", sa.Integer(), nullable=False, server_default=_zero_int),
)
op.add_column(
"product",
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"product",
sa.Column("last_changed", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column("product", sa.Column("deleted_at", sa.DateTime(), nullable=True))
op.create_index(
"ix_product_barcode",
"product",
["barcode"],
unique=True,
postgresql_where=sa.text("deleted_at IS NULL"),
sqlite_where=sa.text("deleted_at IS NULL"),
)
op.drop_column("product", "hard_coded_calories")
op.drop_column("product", "usage_count_cached")
op.add_column(
"user",
sa.Column("version", sa.Integer(), nullable=False, server_default=_zero_int),
)
op.add_column(
"user",
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column(
"user",
sa.Column("last_changed", sa.DateTime(), nullable=False, server_default=_now),
)
op.add_column("user", sa.Column("deleted_at", sa.DateTime(), nullable=True))
# ### end Alembic commands ###
def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column("user", "deleted_at")
op.drop_column("user", "last_changed")
op.drop_column("user", "created_at")
op.drop_column("user", "version")
op.add_column(
"product",
sa.Column(
"usage_count_cached",
sa.BIGINT(),
autoincrement=False,
nullable=False,
),
)
op.add_column(
"product",
sa.Column(
"hard_coded_calories",
sa.DOUBLE_PRECISION(precision=53),
autoincrement=False,
nullable=True,
),
)
op.drop_index(
"ix_product_barcode",
table_name="product",
postgresql_where=sa.text("deleted_at IS NULL"),
sqlite_where=sa.text("deleted_at IS NULL"),
)
op.drop_column("product", "deleted_at")
op.drop_column("product", "last_changed")
op.drop_column("product", "created_at")
op.drop_column("product", "version")
op.drop_column("product", "calories")
op.drop_column("presetentry", "created_at")
op.drop_column("presetentry", "version")
op.drop_column("preset", "last_changed")
op.drop_column("preset", "created_at")
op.drop_column("preset", "version")
op.drop_column("meal", "last_changed")
op.drop_column("meal", "created_at")
op.drop_column("meal", "version")
op.drop_column("entry", "created_at")
op.drop_column("entry", "version")
op.drop_constraint(None, "diary", type_="unique")
op.drop_column("diary", "last_changed")
op.drop_column("diary", "created_at")
op.drop_column("diary", "version")
op.drop_column("diary", "calories_goal")
op.drop_column("diary", "fiber_goal")
op.drop_column("diary", "fat_goal")
op.drop_column("diary", "carb_goal")
op.drop_column("diary", "protein_goal")
op.create_table(
"refreshtoken",
sa.Column("user_id", sa.INTEGER(), autoincrement=False, nullable=False),
sa.Column("token", sa.VARCHAR(), autoincrement=False, nullable=False),
sa.Column("id", sa.INTEGER(), autoincrement=True, nullable=False),
sa.ForeignKeyConstraint(
["user_id"], ["user.id"], name=op.f("refreshtoken_user_id_fkey")
),
sa.PrimaryKeyConstraint("id", name=op.f("refreshtoken_pkey")),
)
op.drop_table("usersettings")
op.drop_table("userproductusage")
# ### end Alembic commands ###
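The removed revision above leans on `server_default` (`CURRENT_TIMESTAMP`, `0`) when adding NOT NULL columns, because a database cannot add a NOT NULL column to an already-populated table unless it can backfill existing rows. A minimal stdlib sketch of why the default matters, using SQLite in place of Postgres:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE diary (id INTEGER PRIMARY KEY)")
conn.execute("INSERT INTO diary DEFAULT VALUES")  # a pre-existing row

# Without DEFAULT this ALTER would fail on the non-empty table; with it,
# the existing row is backfilled -- the same role server_default plays above.
conn.execute("ALTER TABLE diary ADD COLUMN version INTEGER NOT NULL DEFAULT 0")
row = conn.execute("SELECT version FROM diary").fetchone()
```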

View file

@@ -1,71 +0,0 @@
"""
Revision ID: 564e5948f3ed
Revises: 4e8d78ff6e9e
Create Date: 2026-04-07 19:31:01.616100
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "564e5948f3ed"
down_revision: Union[str, Sequence[str], None] = "4e8d78ff6e9e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Upgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.create_index(op.f("ix_entry_meal_id"), "entry", ["meal_id"], unique=False)
op.create_index(op.f("ix_entry_product_id"), "entry", ["product_id"], unique=False)
op.drop_column("entry", "processed")
op.create_index(op.f("ix_meal_diary_id"), "meal", ["diary_id"], unique=False)
op.create_index(op.f("ix_preset_user_id"), "preset", ["user_id"], unique=False)
op.create_index(
op.f("ix_presetentry_preset_id"),
"presetentry",
["preset_id"],
unique=False,
)
op.create_index(
op.f("ix_presetentry_product_id"),
"presetentry",
["product_id"],
unique=False,
)
op.create_index(
"ix_userproductusage_product_user",
"userproductusage",
["product_id", "user_id"],
unique=False,
)
op.create_index(
op.f("ix_userproductusage_user_id"),
"userproductusage",
["user_id"],
unique=False,
)
# ### end Alembic commands ###
def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f("ix_userproductusage_user_id"), table_name="userproductusage")
op.drop_index("ix_userproductusage_product_user", table_name="userproductusage")
op.drop_index(op.f("ix_presetentry_product_id"), table_name="presetentry")
op.drop_index(op.f("ix_presetentry_preset_id"), table_name="presetentry")
op.drop_index(op.f("ix_preset_user_id"), table_name="preset")
op.drop_index(op.f("ix_meal_diary_id"), table_name="meal")
op.add_column(
"entry",
sa.Column("processed", sa.BOOLEAN(), autoincrement=False, nullable=False),
)
op.drop_index(op.f("ix_entry_product_id"), table_name="entry")
op.drop_index(op.f("ix_entry_meal_id"), table_name="entry")
# ### end Alembic commands ###

View file

@@ -1,31 +0,0 @@
"""merge heads
Revision ID: 7bda12666bf0
Revises: 564e5948f3ed, a1b2c3d4e5f6
Create Date: 2026-04-07 20:50:01.702218
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "7bda12666bf0"
down_revision: Union[str, Sequence[str], None] = (
"564e5948f3ed",
"a1b2c3d4e5f6",
)
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Upgrade schema."""
pass
def downgrade() -> None:
"""Downgrade schema."""
pass
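The merge revision above joins two branches (`564e5948f3ed` and `a1b2c3d4e5f6`) that both descend from `4e8d78ff6e9e`, which is why its `down_revision` is a tuple. A toy sketch of how heads arise and disappear in such a revision graph (a simplified model, not alembic's actual API):

```python
# Toy revision graph using the identifiers from the migrations above.
down_revisions = {
    "97d77db27867": None,
    "4e8d78ff6e9e": "97d77db27867",
    "564e5948f3ed": "4e8d78ff6e9e",
    "a1b2c3d4e5f6": "4e8d78ff6e9e",
    "7bda12666bf0": ("564e5948f3ed", "a1b2c3d4e5f6"),  # the merge revision
}


def heads(graph: dict) -> set[str]:
    """Heads are revisions that no other revision names as a parent."""
    parents: set[str] = set()
    for down in graph.values():
        if down is None:
            continue
        parents.update(down if isinstance(down, tuple) else (down,))
    return set(graph) - parents
```

With the merge revision present the graph has a single head again, which is what lets `alembic upgrade head` work without ambiguity.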

View file

@@ -1,32 +0,0 @@
"""
Revision ID: 97d77db27867
Revises:
Create Date: 2026-04-02 19:03:34.833774
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "97d77db27867"
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Upgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
pass
# ### end Alembic commands ###
def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
pass
# ### end Alembic commands ###

View file

@@ -1,24 +0,0 @@
"""add unique constraint to user.username
Revision ID: a1b2c3d4e5f6
Revises: 4e8d78ff6e9e
Create Date: 2026-04-07 15:00:00.000000
"""
from typing import Sequence, Union
from alembic import op
revision: str = "a1b2c3d4e5f6"
down_revision: Union[str, Sequence[str], None] = "4e8d78ff6e9e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
op.create_unique_constraint("uq_user_username", "user", ["username"])
def downgrade() -> None:
op.drop_constraint("uq_user_username", "user", type_="unique")

View file

@@ -0,0 +1,255 @@
"""
Revision ID: d70663890e9d
Revises:
Create Date: 2026-04-16 13:17:38.973404
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision: str = "d70663890e9d"
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Upgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
"product",
sa.Column("name", sa.String(), nullable=False),
sa.Column("protein", sa.Float(), nullable=False),
sa.Column("carb", sa.Float(), nullable=False),
sa.Column("fat", sa.Float(), nullable=False),
sa.Column("fiber", sa.Float(), nullable=False),
sa.Column("calories", sa.Float(), nullable=False),
sa.Column("barcode", sa.String(), nullable=True),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.Column("deleted_at", sa.DateTime(), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
"ix_product_barcode",
"product",
["barcode"],
unique=True,
postgresql_where=sa.text("deleted_at IS NULL"),
sqlite_where=sa.text("deleted_at IS NULL"),
)
op.create_table(
"user",
sa.Column("username", sa.String(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.Column("hashed_password", sa.String(), nullable=False),
sa.Column("deleted_at", sa.DateTime(), nullable=True),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("username"),
)
op.create_table(
"diary",
sa.Column("date", sa.Date(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("protein_goal", sa.Float(), nullable=False),
sa.Column("carb_goal", sa.Float(), nullable=False),
sa.Column("fat_goal", sa.Float(), nullable=False),
sa.Column("fiber_goal", sa.Float(), nullable=False),
sa.Column("calories_goal", sa.Float(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id", "date"),
)
op.create_table(
"preset",
sa.Column("name", sa.String(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_preset_user_id"), "preset", ["user_id"], unique=False
)
op.create_table(
"userproductusage",
sa.Column("product_id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("count", sa.Integer(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["product_id"],
["product.id"],
),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id", "product_id"),
)
op.create_index(
"ix_userproductusage_product_user",
"userproductusage",
["product_id", "user_id"],
unique=False,
)
op.create_index(
op.f("ix_userproductusage_user_id"),
"userproductusage",
["user_id"],
unique=False,
)
op.create_table(
"usersettings",
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("protein_goal", sa.Float(), nullable=False),
sa.Column("carb_goal", sa.Float(), nullable=False),
sa.Column("fat_goal", sa.Float(), nullable=False),
sa.Column("fiber_goal", sa.Float(), nullable=False),
sa.Column("calories_goal", sa.Float(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id"),
)
op.create_table(
"meal",
sa.Column("name", sa.String(), nullable=False),
sa.Column("order", sa.Integer(), nullable=False),
sa.Column("diary_id", sa.Integer(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["diary_id"],
["diary.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_meal_diary_id"), "meal", ["diary_id"], unique=False
)
op.create_table(
"presetentry",
sa.Column("grams", sa.Float(), nullable=False),
sa.Column("product_id", sa.Integer(), nullable=False),
sa.Column("preset_id", sa.Integer(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["preset_id"],
["preset.id"],
),
sa.ForeignKeyConstraint(
["product_id"],
["product.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_presetentry_preset_id"),
"presetentry",
["preset_id"],
unique=False,
)
op.create_index(
op.f("ix_presetentry_product_id"),
"presetentry",
["product_id"],
unique=False,
)
op.create_table(
"entry",
sa.Column("grams", sa.Float(), nullable=False),
sa.Column("product_id", sa.Integer(), nullable=False),
sa.Column("meal_id", sa.Integer(), nullable=False),
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("version", sa.Integer(), nullable=False),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("last_changed", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["meal_id"],
["meal.id"],
),
sa.ForeignKeyConstraint(
["product_id"],
["product.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_entry_meal_id"), "entry", ["meal_id"], unique=False
)
op.create_index(
op.f("ix_entry_product_id"), "entry", ["product_id"], unique=False
)
# ### end Alembic commands ###
def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f("ix_entry_product_id"), table_name="entry")
op.drop_index(op.f("ix_entry_meal_id"), table_name="entry")
op.drop_table("entry")
op.drop_index(op.f("ix_presetentry_product_id"), table_name="presetentry")
op.drop_index(op.f("ix_presetentry_preset_id"), table_name="presetentry")
op.drop_table("presetentry")
op.drop_index(op.f("ix_meal_diary_id"), table_name="meal")
op.drop_table("meal")
op.drop_table("usersettings")
op.drop_index(
op.f("ix_userproductusage_user_id"), table_name="userproductusage"
)
op.drop_index(
"ix_userproductusage_product_user", table_name="userproductusage"
)
op.drop_table("userproductusage")
op.drop_index(op.f("ix_preset_user_id"), table_name="preset")
op.drop_table("preset")
op.drop_table("diary")
op.drop_table("user")
op.drop_index(
"ix_product_barcode",
table_name="product",
postgresql_where=sa.text("deleted_at IS NULL"),
sqlite_where=sa.text("deleted_at IS NULL"),
)
op.drop_table("product")
# ### end Alembic commands ###
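`ix_product_barcode` in the squashed revision above is a partial unique index: uniqueness is enforced only among rows where `deleted_at IS NULL`, so soft-deleting a product frees its barcode for reuse. A stdlib SQLite sketch of that behavior (the migration targets Postgres, but SQLite supports the same `WHERE` clause):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE product (id INTEGER PRIMARY KEY, barcode TEXT, deleted_at TEXT)"
)
# Same shape as the migration's index: unique only among non-deleted rows.
conn.execute(
    "CREATE UNIQUE INDEX ix_product_barcode ON product (barcode) "
    "WHERE deleted_at IS NULL"
)

conn.execute("INSERT INTO product (barcode) VALUES ('123')")
conn.execute("UPDATE product SET deleted_at = '2026-04-16' WHERE barcode = '123'")
conn.execute("INSERT INTO product (barcode) VALUES ('123')")  # allowed: old row is soft-deleted

try:
    conn.execute("INSERT INTO product (barcode) VALUES ('123')")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True  # two live rows with the same barcode are refused
```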

View file

@@ -23,6 +23,10 @@ class PresetUpdateModel(BaseModel):
     name: str | None = None
 
 
+class PresetCreateModel(BaseModel):
+    name: str
+
+
 class LoadPresetAsMealModel(BaseModel):
     preset_id: int
     name: str | None = None

View file

@@ -81,7 +81,10 @@ async def test_update_diary_partial_update(auth_client, diary, user_settings):
     body = response.json()
     assert body["protein_goal"] == 180.0
     assert body["carb_goal"] == user_settings.carb_goal
-    assert body["calories_goal"] == user_settings.calories_goal
+    assert body["fat_goal"] == user_settings.fat_goal
+    assert (
+        body["calories_goal"] != user_settings.calories_goal
+    )  # recalculated since None
 
 
 async def test_update_diary_all_goals(auth_client, diary):
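The changed assertion encodes a partial-update rule: goals omitted from a PATCH fall back to the user's settings, while `calories_goal` is recalculated rather than copied. A stdlib sketch of that merge rule; the calorie formula here (Atwater 4/4/9 plus 2 kcal/g fiber) is a hypothetical stand-in, not necessarily what fooder computes:

```python
def merge_goals(settings: dict, patch: dict) -> dict:
    """Fields explicitly set in the PATCH win; None means 'use user settings'."""
    merged = {
        k: patch.get(k) if patch.get(k) is not None else settings[k]
        for k in ("protein_goal", "carb_goal", "fat_goal", "fiber_goal")
    }
    # Hypothetical recalculation: the point is only that an omitted
    # calories_goal is derived from the merged macros, not copied verbatim.
    merged["calories_goal"] = (
        4 * merged["protein_goal"] + 4 * merged["carb_goal"]
        + 9 * merged["fat_goal"] + 2 * merged["fiber_goal"]
    )
    return merged


user_settings = {"protein_goal": 120.0, "carb_goal": 250.0,
                 "fat_goal": 70.0, "fiber_goal": 30.0, "calories_goal": 2000.0}
result = merge_goals(user_settings, {"protein_goal": 180.0})
```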

View file

@@ -92,6 +92,25 @@ async def test_list_presets_without_auth_returns_401(client):
     assert response.status_code == 401
 
 
+# --- create preset ---
+async def test_create_preset_returns_201(auth_client):
+    response = await auth_client.post("/api/preset/", json={"name": "Preset name"})
+    assert response.status_code == 201
+
+
+async def test_create_preset_sets_name(auth_client):
+    response = await auth_client.post("/api/preset/", json={"name": "Preset name"})
+    assert response.status_code == 201
+    assert response.json()["name"] == "Preset name"
+
+
+async def test_create_preset_without_auth_returns_401(client):
+    response = await client.post("/api/preset/", json={"name": "x"})
+    assert response.status_code == 401
+
+
 # --- update preset ---

View file

@@ -58,7 +58,9 @@ async def test_create_user_password_too_short_returns_422(client, new_user_paylo
     assert response.status_code == 422
 
 
-async def test_create_user_captcha_failure_returns_403(client, monkeypatch, new_user_payload):
+async def test_create_user_captcha_failure_returns_403(
+    client, monkeypatch, new_user_payload
+):
     async def _fail(token: str, ip=None) -> None:
         raise CaptchaFailed()
 
@@ -75,7 +77,9 @@ async def test_change_password_returns_204(auth_client, user_password):
     assert response.status_code == 204
 
 
-async def test_change_password_can_login_with_new_password(auth_client, user, user_password):
+async def test_change_password_can_login_with_new_password(
+    auth_client, user, user_password
+):
     new_password = "newpassword1"
     await auth_client.patch(
         "/api/user/password",

View file

@@ -4,7 +4,7 @@ from fooder.context import AuthContextDependency, Context
 from fooder.controller.preset import PresetController
 from fooder.controller.preset_entry import PresetEntryController
 from fooder.model.entry import EntryCreateModel, EntryUpdateModel
-from fooder.model.preset import PresetModel, PresetUpdateModel
+from fooder.model.preset import PresetCreateModel, PresetModel, PresetUpdateModel
 from fooder.model.preset_entry import PresetEntryModel
 
 router = APIRouter()
@@ -23,6 +23,16 @@ async def list_presets(
     )
 
 
+@router.post("/", response_model=PresetModel, status_code=201)
+async def create_preset(
+    data: PresetCreateModel,
+    ctx: Context = Depends(_auth_ctx),
+):
+    async with ctx.repo.transaction():
+        ctrl = await PresetController.create(ctx, data.name)
+        return ctrl.obj
+
+
 @router.patch("/{preset_id}", response_model=PresetModel)
 async def update_preset(
     preset_id: int,
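The new `create_preset` route wraps controller creation in `async with ctx.repo.transaction()`, so the preset is only committed if the body completes without error. A toy stdlib sketch of that pattern; `FakeRepo` and its methods are hypothetical stand-ins, not fooder's repository API:

```python
import asyncio
from contextlib import asynccontextmanager


class FakeRepo:
    """Stand-in repository: commits pending objects on success, drops them on error."""

    def __init__(self) -> None:
        self.presets: list[dict] = []
        self._pending: list[dict] = []

    @asynccontextmanager
    async def transaction(self):
        self._pending = []
        try:
            yield self
            self.presets.extend(self._pending)  # "commit" on clean exit
        finally:
            self._pending = []  # pending work never survives the block

    async def add(self, name: str) -> dict:
        obj = {"id": len(self.presets) + len(self._pending) + 1, "name": name}
        self._pending.append(obj)
        return obj


async def create_preset(repo: FakeRepo, name: str) -> dict:
    # Mirrors the route's shape: create inside a transaction, return the object.
    async with repo.transaction():
        return await repo.add(name)


repo = FakeRepo()
preset = asyncio.run(create_preset(repo, "Preset name"))
```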