Compare commits


13 commits

Author SHA1 Message Date
Benno Tielen
d77a29ecd7 fix: icons
2026-04-11 15:33:51 +02:00
Benno Tielen
b0e7463090 feature: image card block element 2026-04-11 15:16:51 +02:00
Benno Tielen
e17c161ee3 fix: add payload cms data 2026-04-10 11:40:55 +02:00
Benno Tielen
005f91b898 feature: button in richtext 2026-04-10 11:39:30 +02:00
Benno Tielen
36956d7daf feature: infrastructure for deployment 2026-04-10 11:39:02 +02:00
Benno Tielen
3bf9b9fdc3 fix: add missing ico 2026-04-10 11:38:02 +02:00
Benno Tielen
46882e175d feature: events 2026-04-09 16:06:31 +02:00
Benno Tielen
a49daabac7 feature: favicon per site 2026-04-09 15:38:32 +02:00
Benno Tielen
3e05a9df82 fix: slug 2026-04-09 12:00:08 +02:00
Benno Tielen
0f902edc1b fix: no wrap 2026-04-09 11:47:47 +02:00
Benno Tielen
c19f8725fe fix: logo 2026-04-09 11:24:39 +02:00
Benno Tielen
d4cfec3a98 feature: schedule mass times 2026-04-09 11:12:54 +02:00
Benno Tielen
ee0fa3678f fix: homepage 2026-04-08 14:45:34 +02:00
78 changed files with 122314 additions and 450 deletions

.gitignore
@@ -44,6 +44,11 @@ next-env.d.ts
.env
# Payload CMS data
/media
/documents
*storybook.log
# Generated per-site favicon (copied from sites/<NEXT_PUBLIC_SITE_ID>/icon.ico by scripts/copy-favicon.mjs)
/src/app/(home)/icon.ico

@@ -73,6 +73,42 @@ Note: After deploying changes to production, ensure migrations are executed agai
Once the server is running, open:
- http://localhost:3000/admin
## Recurring mass times (jobs queue)
Churches have a `recurringSchedule` field on their admin page where editors can
configure recurring services — weekly, biweekly (with an anchor date for the
2-week parity), or monthly by weekday (e.g. every 3rd Sunday). These entries
are not rendered directly: a background job materializes them into real
`Worship` documents for a rolling window a few weeks into the future, so the
existing `MassTimesBlock`, parish pages, and `/gottesdienst/{id}` detail pages
work unchanged. Generated documents are normal `Worship` docs — an editor can
still cancel a specific occurrence, change the celebrant, etc., and the job
will never overwrite manual edits (it only ever appends new rows).
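The 2-week parity rule can be sketched in shell (illustrative only — the real
logic lives in `src/jobs/generateRecurringMasses.ts`, and the helper names here
are invented for this example):

```bash
# Biweekly parity sketch: an occurrence happens on weeks that are an even
# number of weeks away from the anchor date.
days_between() {
  # Whole days from date $1 to date $2 (UTC midnights, GNU date)
  echo $(( ( $(date -u -d "$2" +%s) - $(date -u -d "$1" +%s) ) / 86400 ))
}

biweekly_occurs() {
  # $1 = anchor date, $2 = candidate date (same weekday as the anchor)
  local weeks=$(( $(days_between "$1" "$2") / 7 ))
  if [ $(( weeks % 2 )) -eq 0 ]; then echo yes; else echo no; fi
}

biweekly_occurs 2026-04-06 2026-04-20   # 2 weeks after the anchor -> yes
biweekly_occurs 2026-04-06 2026-04-13   # off week -> no
```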
The task is defined in `src/jobs/generateRecurringMasses.ts` and is triggered
three ways:
1. **On save** — `Churches.afterChange` queues a job whenever a schedule is
   edited.
2. **Weekly cron** — the task carries its own `schedule` so Payload auto-queues
it every Monday at 03:00 to keep the rolling window populated.
3. **Manually** — queue from code with
`payload.jobs.queue({ task: 'generateRecurringMasses' })`, or run the full
queue from the CLI:
```bash
npm run payload jobs:run --cron "* * * * *" --queue default
```
`autoRun` in `payload.config.ts` only fires in production
(`NODE_ENV === 'production'`), so in development you must either run the CLI
command above, temporarily flip `shouldAutoRun` to `true`, or use the "Run now"
button on the Payload Jobs admin page.
> Note: Payload's `autoRun` requires a long-running server. This project ships
> via Docker, so that's fine — but if the app ever moves to a serverless host
> like Vercel, `autoRun` must be replaced with an external cron (e.g.
> Vercel Cron) that invokes the CLI command above.
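A minimal external cron could look like this (hypothetical — the project does
not ship such an entry; the container name and interval are placeholders):

```bash
# Host crontab entry standing in for autoRun:
# run the default queue every 5 minutes inside the staging container.
*/5 * * * * docker exec app-staging npx payload jobs:run --queue default
```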
## Scripts
Common scripts available in `package.json`:
- `npm run dev` — start Next.js dev server

infra/README.md
@@ -0,0 +1,290 @@
# Infrastructure & Deployment
## Architecture
```
VPS (Ubuntu 24, 8 GB RAM)
├── Caddy — reverse proxy + auto SSL (native)
├── PostgreSQL — postgis/postgis:16-3.4 (Docker)
├── Forgejo — git server + CI/CD (Docker)
├── Forgejo Runner — executes CI/CD jobs (Docker)
├── app-staging — Next.js + Payload CMS (Docker)
└── app-test — Next.js + Payload CMS (Docker)
```
| URL | Port | Purpose |
|-----|------|---------|
| mutter-teresa.skick.app | 3001 | Client demo (staging) |
| mutter-teresa-test.skick.app | 3002 | Developer testing |
| git.skick.app | 3003 | Forgejo git server |
All app and database containers share the Docker network `church-website-net`.
---
## Prerequisites
- **Ansible** installed locally (`pip install ansible` or `brew install ansible`)
- **SSH access** to the VPS (root or sudo user)
- **DNS records** pointing to the VPS IP:
- `mutter-teresa.skick.app` → VPS IP
- `mutter-teresa-test.skick.app` → VPS IP
- `git.skick.app` → VPS IP
---
## Quick Start: First-Time Server Setup
### 1. Configure secrets
Create an encrypted vault from the example template:
```bash
cd infra/ansible
cp inventory/group_vars/all/vault.yml.example inventory/group_vars/all/vault.yml
ansible-vault encrypt inventory/group_vars/all/vault.yml
ansible-vault edit inventory/group_vars/all/vault.yml
```
Fill in all `CHANGE_ME` values:
- `vault_ansible_become_pass` — VPS root password
- `vault_postgres_root_password` — PostgreSQL root password
- `vault_db_password_staging` / `vault_db_password_test` — database passwords
- `vault_payload_secret_staging` / `vault_payload_secret_test` — Payload CMS secrets
- `vault_google_bucket` — Google Cloud Storage bucket name
- `vault_resend_api_key` — Resend email API key
- `vault_repo_url` — Forgejo repository URL (e.g., `ssh://git@git.skick.app:2222/org/church-website.git`)
### 2. Configure inventory
Edit `infra/ansible/inventory/test.yml`:
- Set `ansible_host` to your VPS IP address
- Adjust `ansible_user` and SSH key path if needed
### 3. Run the playbook
```bash
cd infra/ansible
ansible-playbook playbooks/setup.yml -i inventory/test.yml --ask-vault-pass
```
This will:
1. Install Docker, configure firewall
2. Start PostgreSQL with both databases
3. Install and configure Caddy with SSL
4. Start Forgejo and the CI/CD runner
5. Clone the repo, build, and deploy both environments
### 4. Set up Forgejo
After the playbook completes:
1. Visit `https://git.skick.app` and complete the initial Forgejo setup
2. Create an organization and repository
3. Add the VPS SSH key to the repository for pull access
4. Register the Forgejo Runner:
```bash
ssh root@YOUR_VPS_IP
docker exec -it forgejo-runner forgejo-runner register \
--instance https://git.skick.app \
--token YOUR_RUNNER_TOKEN \
--name local-runner \
--labels ubuntu-latest:docker://node:22
```
5. Push to the `staging` branch — CI/CD will deploy automatically
---
## Environment Variables
| Variable | Description | Build-time? |
|----------|-------------|-------------|
| `DATABASE_URI` | PostgreSQL connection string | No |
| `PAYLOAD_SECRET` | Payload CMS encryption secret | No |
| `NEXT_PUBLIC_SERVER_URL` | Public URL of the app | Yes |
| `NEXT_PUBLIC_SITE_ID` | Site identifier (e.g., `chemnitz`) | Yes |
| `GOOGLE_BUCKET` | GCS bucket for media storage | No |
| `RESEND_API_KEY` | Resend API key for emails | No |
Variables marked "Build-time" are baked into the Docker image during `docker build` (via `--build-arg`). Changes to these require a rebuild.
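As an illustration (the exact invocation lives in `scripts/deploy.sh` and may
differ), a staging build would pass them roughly like this:

```bash
# Build-time vars are baked into the image via --build-arg;
# runtime vars come from the --env-file at container start instead.
docker build \
  --build-arg NEXT_PUBLIC_SERVER_URL=https://mutter-teresa.skick.app \
  --build-arg NEXT_PUBLIC_SITE_ID=chemnitz \
  -t church-website:staging .
```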
---
## Manual Operations
### Check container logs
```bash
docker logs app-staging
docker logs app-test
docker logs postgres
docker logs forgejo
```
### Redeploy manually (without CI/CD)
```bash
cd /opt/church-website/repo
git pull origin staging
/opt/church-website/scripts/deploy.sh staging 3001
/opt/church-website/scripts/deploy.sh test 3002
```
### Run migrations manually
```bash
docker exec app-staging npx payload migrate
docker exec app-test npx payload migrate
```
### Database backup
```bash
# Backup staging database
docker exec postgres pg_dump -U church_website_staging church_website_staging > backup_staging_$(date +%Y%m%d).sql
# Backup test database
docker exec postgres pg_dump -U church_website_test church_website_test > backup_test_$(date +%Y%m%d).sql
# Backup all databases
docker exec postgres pg_dumpall -U postgres > backup_all_$(date +%Y%m%d).sql
```
### Database restore
```bash
# Restore staging database
cat backup_staging.sql | docker exec -i postgres psql -U church_website_staging church_website_staging
```
### Restart a single service
```bash
docker restart app-staging
docker restart app-test
docker restart postgres
```
---
## Deploy via Ansible (without CI/CD)
Use the `deploy.yml` playbook to deploy from your local machine — no Forgejo runner or CI/CD pipeline needed. This is useful for hotfixes, CI outages, or production servers without Forgejo.
```bash
cd infra/ansible
# Deploy to test/staging VPS
ansible-playbook playbooks/deploy.yml -i inventory/test.yml --ask-vault-pass
# Deploy to production
ansible-playbook playbooks/deploy.yml -i inventory/production.yml --ask-vault-pass
```
**What it does:**
1. Pulls the latest code from the configured branch (`repo_branch` in inventory)
2. Runs `deploy.sh` for each environment (sequentially to save RAM), which:
- Builds the Docker app image with build-time env vars
- Builds a migration image and runs `npx payload migrate`
- Stops the old container, starts the new one
- Prunes old Docker images
**Deploy a specific branch:**
```bash
ansible-playbook playbooks/deploy.yml -i inventory/test.yml --ask-vault-pass \
-e repo_branch=feature/my-branch
```
**Deploy only one environment** (e.g., just staging):
```bash
ansible-playbook playbooks/deploy.yml -i inventory/test.yml --ask-vault-pass \
-e '{"app_environments": [{"name": "staging", "port": 3001}]}'
```
> **Note:** The server must already be provisioned with `setup.yml` before using `deploy.yml`. The deploy playbook only pulls code and rebuilds containers — it does not install Docker, Caddy, or PostgreSQL.
---
## CI/CD
The Forgejo Actions workflow (`.forgejo/workflows/deploy.yml`) triggers on push to the `staging` branch. It:
1. Pulls the latest code on the VPS
2. Builds a new Docker image for staging
3. Stops the old container, starts the new one
4. Runs database migrations
5. Repeats for the test environment (sequentially, to save RAM)
---
## Adding a New Environment
1. Add a new entry to `app_environments` in the inventory file
2. Add a new entry to `caddy_domains` with the new domain
3. Add a new database entry to `databases`
4. Run the playbook: `ansible-playbook playbooks/setup.yml -i inventory/test.yml`
5. Update the deploy workflow to include the new environment
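Steps 1–3 in the test inventory might look like this (the `demo` name, port,
domain, and vault variables are hypothetical examples):

```yaml
app_environments:
  - name: demo
    port: 3004
    domain: mutter-teresa-demo.skick.app
    db_name: church_website_demo
    db_user: church_website_demo
    db_password: "{{ vault_db_password_demo }}"
    payload_secret: "{{ vault_payload_secret_demo }}"
    site_id: chemnitz
    google_bucket: "{{ vault_google_bucket }}"
    resend_api_key: "{{ vault_resend_api_key }}"
caddy_domains:
  - domain: mutter-teresa-demo.skick.app
    proxy_port: 3004
databases:
  - name: church_website_demo
    user: church_website_demo
    password: "{{ vault_db_password_demo }}"
```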
---
## Production Setup
1. Copy and edit the production inventory:
```bash
cp infra/ansible/inventory/production.yml infra/ansible/inventory/my-production.yml
```
2. Fill in the production VPS IP, domain, and secrets
3. Run the playbook (skip Forgejo):
```bash
ansible-playbook playbooks/setup.yml -i inventory/my-production.yml --ask-vault-pass
```
4. Set up a deploy workflow for production (triggered on tags/releases)
---
## Troubleshooting
### Build fails with OOM
The VPS has 4 GB RAM + 2 GB swap. Docker builds can peak at ~1.5 GB. If builds fail:
- Ensure only one build runs at a time (deploy script is sequential)
- Check swap: `free -h`
- Increase swap: edit `swap_size_mb` in inventory and re-run playbook
### SSL certificate not working
- Ensure DNS records point to the VPS IP: `dig mutter-teresa.skick.app`
- Check Caddy logs: `journalctl -u caddy`
- Caddy auto-renews certificates — if stuck, restart: `systemctl restart caddy`
### Database connection refused
- Check PostgreSQL is running: `docker ps | grep postgres`
- Check the container is on the right network: `docker network inspect church-website-net`
- Test connection: `docker exec postgres psql -U postgres -l`
### Container won't start
- Check logs: `docker logs app-staging`
- Check if port is in use: `ss -tlnp | grep 3001`
- Check .env file: `cat /opt/church-website/envs/staging/.env`
---
## Local Development
For local development with PostgreSQL:
```bash
# Start PostgreSQL (from project root)
docker compose up -d

# Add to .env:
#   DATABASE_URI=postgres://postgres:password@localhost:5432/church_website_dev

# Start dev server
npm run dev
```

infra/ansible/ansible.cfg
@@ -0,0 +1,10 @@
[defaults]
inventory = inventory/
roles_path = roles/
host_key_checking = False
retry_files_enabled = False
remote_tmp = /tmp/.ansible/tmp
[privilege_escalation]
become = True
become_method = sudo

@@ -0,0 +1,3 @@
---
# Non-secret shared variables
# Secrets go in vault.yml (encrypted) in this same directory

@@ -0,0 +1,45 @@
$ANSIBLE_VAULT;1.1;AES256
35356634333331616130643630356337646335653935313561396561366261356265373038363564
6433623739346632353765303637636565613263326165380a346134336466323661393563626663
31343664373132666566383764336532663830623435333537313136333336633938326236633438
3561646361323539640a656134343263316563383837633931653066336238636239643465373236
66323966323433613466666233353731353738386665383239316338333161646264663331613162
30366637623136616137306663383030346631623432343037313239386666626266383036333537
39636362643430623937346633666264353137623564353138393431393866386538613962643661
36373862346636393730663665393564636463366433396533333162626232643331643338343037
35336662306338616561653762313465363538386636303331323133383633386332663063653764
63613565313864646362643736393135303435343162313864663038613865643631386337326534
61613131616163323735643432656664396135633263346530383034323865353139613662356437
32393934386139353130303865316237353865376232653563356236366435373963393237646337
37306266363064636130633134306666326365316161383133373334313239343831396364646333
62663662306534663638316364333730376631336332333364653462326263333861353836643739
64656361643035663635643461616166663534356638613434636565356461353234303633633164
37613032346663623733313966383736393838323361366237383033373133656232363833656161
63646635373237636266313966666336353831373130333163333864616437636362623836636535
61646263373830663166323736666333386234623430643636333066363061646161393935663661
33393232653137303762643663396663653563646662363061633338333136303134313732356136
34616233653562323263356530633636383465353735316238653330316164333032643064313662
36643662323133613933363534313263633365373761663466376462326237303337396566366466
31656639383063653962666233336166633930656534363961306238623439626261336465306538
38343039383132313837376531353138333339303964313931393533633261303035323331613132
61636366363966373964396232323932666663316334383863633761666330376332383564326632
35383038353366633038623239386462386165643630623561663963343035623837353230323235
65303635613265613537333335373030613237333463373061363366633063653365383139326131
30333765656338356135616566316639646238326162653033643663393032333461343661363736
30393565386134623734343165333164653532366337373430356664353637343166363430313137
61336666386662363066323164613539383366656533393766633136303534393464623334633762
34326664386464666461653536393665313239323937393465306634393663636364656438303963
64643634373766323465613638653833623235663738626431616330623262366635373334643838
64663861653861343163313836643333643730643838613364646236656337393036336366646362
33363033663461313933323637623736303131643962333665616265396566663136303236323564
31373665316232656239366466373932393336376437626465616233663430636362636532626661
30633866666263656439313236663630383733306439633936626139356235366439383030613832
63666333356466323733323737366131333033376432646162626633356438373639306133623531
34303463626561633161396262323639353135316137643934383635636136303833333934633830
32346534383564386364643262643936383233306133653661356138336563616261363232613935
39613134333435303535336235613262346162613566636433383266623162663463663862393363
63393366663231633265616463616363396264626666346666303937353665383565636238336231
38396634323337663133386639633662663462623731323134313939613437333537333666303466
31616230623739323364376663333730633464653434313333646466623562316466613435346566
35313866616530643930326238306339613138646664316639663033303666643661373839356235
39646463386236633463

@@ -0,0 +1,17 @@
---
# Copy this file to vault.yml and encrypt it:
# cp vault.yml.example vault.yml
# ansible-vault encrypt vault.yml
# ansible-vault edit vault.yml
vault_ansible_become_pass: "CHANGE_ME"
vault_postgres_root_password: "CHANGE_ME"
vault_db_password_staging: "CHANGE_ME"
vault_db_password_test: "CHANGE_ME"
vault_db_password: "CHANGE_ME"
vault_payload_secret_staging: "CHANGE_ME"
vault_payload_secret_test: "CHANGE_ME"
vault_payload_secret: "CHANGE_ME"
vault_google_bucket: "CHANGE_ME"
vault_resend_api_key: "CHANGE_ME"
vault_repo_url: "ssh://git@git.skick.app:2222/org/church-website.git"

@@ -0,0 +1,42 @@
# Production inventory — fill in when ready
all:
hosts:
production-vps:
ansible_host: YOUR_PRODUCTION_VPS_IP
ansible_user: root
ansible_ssh_private_key_file: ~/.ssh/id_ed25519
vars:
swap_size_mb: 2048
docker_network: church-website-net
postgres_container_name: postgres
postgres_image: postgis/postgis:16-3.4
postgres_volume: pgdata
databases:
- name: church_website
user: church_website
password: "{{ vault_db_password }}"
caddy_domains:
- domain: YOUR_PRODUCTION_DOMAIN
proxy_port: 3001
app_environments:
- name: production
port: 3001
domain: YOUR_PRODUCTION_DOMAIN
db_name: church_website
db_user: church_website
db_password: "{{ vault_db_password }}"
payload_secret: "{{ vault_payload_secret }}"
site_id: chemnitz
google_bucket: "{{ vault_google_bucket }}"
resend_api_key: "{{ vault_resend_api_key }}"
repo_dir: /opt/church-website/repo
envs_dir: /opt/church-website/envs
scripts_dir: /opt/church-website/scripts
repo_url: "{{ vault_repo_url }}"
repo_branch: master

@@ -0,0 +1,70 @@
all:
hosts:
test-vps:
ansible_host: 178.104.35.59
ansible_user: root
ansible_ssh_pass: "{{ vault_ansible_become_pass }}"
#ansible_ssh_private_key_file: ~/.ssh/id_ed25519
vars:
# Docker
docker_network: church-website-net
# PostgreSQL
postgres_container_name: postgres
postgres_image: postgis/postgis:16-3.4
postgres_volume: pgdata
# Databases
databases:
- name: church_website_staging
user: church_website_staging
password: "{{ vault_db_password_staging }}"
- name: church_website_test
user: church_website_test
password: "{{ vault_db_password_test }}"
# Caddy
caddy_domains:
- domain: mutter-teresa.skick.app
proxy_port: 3001
- domain: mutter-teresa-test.skick.app
proxy_port: 3002
- domain: git.skick.app
proxy_port: 3003
# Forgejo
forgejo_domain: git.skick.app
forgejo_container_name: forgejo
forgejo_port: 3003
forgejo_ssh_port: 2222
# App environments
app_environments:
- name: staging
port: 3001
domain: mutter-teresa.skick.app
db_name: church_website_staging
db_user: church_website_staging
db_password: "{{ vault_db_password_staging }}"
payload_secret: "{{ vault_payload_secret_staging }}"
site_id: chemnitz
google_bucket: "{{ vault_google_bucket }}"
resend_api_key: "{{ vault_resend_api_key }}"
- name: test
port: 3002
domain: mutter-teresa-test.skick.app
db_name: church_website_test
db_user: church_website_test
db_password: "{{ vault_db_password_test }}"
payload_secret: "{{ vault_payload_secret_test }}"
site_id: chemnitz
google_bucket: "{{ vault_google_bucket }}"
resend_api_key: "{{ vault_resend_api_key }}"
# Repo
repo_dir: /opt/church-website/repo
envs_dir: /opt/church-website/envs
scripts_dir: /opt/church-website/scripts
repo_url: "{{ vault_repo_url }}"
repo_branch: staging

@@ -0,0 +1,154 @@
---
- name: Copy staging data to test environment
hosts: all
become: true
vars:
staging_db: church_website_staging
test_db: church_website_test
test_db_user: church_website_test
test_container: app-test
test_port: 3002
test_image: "church-website:test"
tasks:
# ── Phase 1: Pre-flight ───────────────────────────────────────────
- name: Verify postgres container is running
ansible.builtin.shell: docker ps --filter name=^{{ postgres_container_name }}$ --format '{{ '{{' }}.Status{{ '}}' }}'
register: pg_status
changed_when: false
failed_when: "'Up' not in pg_status.stdout"
- name: Verify staging database exists
ansible.builtin.shell: >
docker exec {{ postgres_container_name }}
psql -U postgres -tAc "SELECT 1 FROM pg_database WHERE datname = '{{ staging_db }}'"
register: staging_exists
changed_when: false
failed_when: "'1' not in staging_exists.stdout"
# ── Phase 2: Stop test app ────────────────────────────────────────
- name: Stop test container
ansible.builtin.shell: docker stop {{ test_container }} 2>/dev/null || true
changed_when: false
- name: Remove test container
ansible.builtin.shell: docker rm {{ test_container }} 2>/dev/null || true
changed_when: false
# ── Phase 3: Database copy ────────────────────────────────────────
- name: Terminate connections to test database
ansible.builtin.shell: >
docker exec {{ postgres_container_name }}
psql -U postgres -c
"SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = '{{ test_db }}' AND pid <> pg_backend_pid();"
changed_when: false
- name: Drop test database
ansible.builtin.shell: >
docker exec {{ postgres_container_name }}
psql -U postgres -c "DROP DATABASE IF EXISTS {{ test_db }};"
- name: Create test database
ansible.builtin.shell: >
docker exec {{ postgres_container_name }}
psql -U postgres -c "CREATE DATABASE {{ test_db }} OWNER {{ test_db_user }};"
- name: Enable PostGIS extension on test database
ansible.builtin.shell: >
docker exec {{ postgres_container_name }}
psql -U postgres -d {{ test_db }} -c "CREATE EXTENSION IF NOT EXISTS postgis;"
- name: Dump staging and restore into test
ansible.builtin.shell: >
docker exec {{ postgres_container_name }}
bash -c "pg_dump -U postgres --no-owner --no-acl {{ staging_db }} | psql -U postgres -d {{ test_db }}"
- name: Reassign ownership to test user
ansible.builtin.shell: |
docker exec {{ postgres_container_name }} psql -U postgres -d {{ test_db }} -c "
DO \$\$
DECLARE
r RECORD;
BEGIN
FOR r IN SELECT tablename FROM pg_tables WHERE schemaname = 'public' LOOP
EXECUTE 'ALTER TABLE public.' || quote_ident(r.tablename) || ' OWNER TO {{ test_db_user }}';
END LOOP;
FOR r IN SELECT sequencename FROM pg_sequences WHERE schemaname = 'public' LOOP
EXECUTE 'ALTER SEQUENCE public.' || quote_ident(r.sequencename) || ' OWNER TO {{ test_db_user }}';
END LOOP;
FOR r IN SELECT typname FROM pg_type t
WHERE t.typnamespace = 'public'::regnamespace
AND t.typtype = 'e'
AND NOT EXISTS (
SELECT 1 FROM pg_depend d
WHERE d.objid = t.oid AND d.deptype = 'e'
)
LOOP
EXECUTE 'ALTER TYPE public.' || quote_ident(r.typname) || ' OWNER TO {{ test_db_user }}';
END LOOP;
END
\$\$;
"
- name: Verify tables exist in test database
ansible.builtin.shell: >
docker exec {{ postgres_container_name }}
psql -U postgres -d {{ test_db }} -tAc "SELECT count(*) FROM pg_tables WHERE schemaname = 'public';"
register: table_count
changed_when: false
failed_when: "table_count.stdout | int < 1"
# ── Phase 4: Volume copy ─────────────────────────────────────────
- name: Copy media volume from staging to test
ansible.builtin.shell: >
docker run --rm
-v uploads-staging-media:/source:ro
-v uploads-test-media:/target
alpine sh -c "rm -rf /target/* && cp -a /source/. /target/"
- name: Copy documents volume from staging to test
ansible.builtin.shell: >
docker run --rm
-v uploads-staging-documents:/source:ro
-v uploads-test-documents:/target
alpine sh -c "rm -rf /target/* && cp -a /source/. /target/"
# ── Phase 5: Restart test app ─────────────────────────────────────
- name: Start test container
ansible.builtin.shell: >
docker run -d
--name {{ test_container }}
--restart unless-stopped
--network {{ docker_network }}
--env-file {{ envs_dir }}/test/.env
-v uploads-test-media:/app/media
-v uploads-test-documents:/app/documents
-p 127.0.0.1:{{ test_port }}:3000
{{ test_image }}
- name: Fix volume permissions
ansible.builtin.shell: >
docker exec -u 0 {{ test_container }}
chown -R 1001:1001 /app/media /app/documents
# ── Phase 6: Health check ─────────────────────────────────────────
- name: Wait for test app to be healthy
ansible.builtin.uri:
url: "http://127.0.0.1:{{ test_port }}"
method: GET
status_code: [200, 301, 302]
register: health
retries: 10
delay: 5
until: health.status in [200, 301, 302]
- name: Print summary
ansible.builtin.debug:
msg: |
Staging → Test copy complete!
- Database: {{ staging_db }} → {{ test_db }} ({{ table_count.stdout }} tables)
- Media & documents volumes copied
- Test app running on port {{ test_port }}
- Health check: HTTP {{ health.status }}
- URL: https://mutter-teresa-test.skick.app

@@ -0,0 +1,19 @@
---
- name: Deploy app (rebuild + restart)
hosts: all
become: true
tasks:
- name: Pull latest code
ansible.builtin.git:
repo: "{{ repo_url }}"
dest: "{{ repo_dir }}"
version: "{{ repo_branch }}"
force: true
- name: Deploy each environment
ansible.builtin.shell: |
{{ scripts_dir }}/deploy.sh {{ item.name }} {{ item.port }}
loop: "{{ app_environments }}"
loop_control:
label: "{{ item.name }}"

@@ -0,0 +1,11 @@
---
- name: Set up church-website server
hosts: all
become: true
roles:
- common
- postgresql
- caddy
- forgejo
- app

@@ -0,0 +1,9 @@
---
- name: Deploy each environment (sequentially to save RAM)
ansible.builtin.shell: |
{{ scripts_dir }}/deploy.sh {{ item.name }} {{ item.port }}
loop: "{{ app_environments }}"
loop_control:
label: "{{ item.name }}"
register: deploy_result
changed_when: true

@@ -0,0 +1,26 @@
---
- name: Deploy deploy script
ansible.builtin.copy:
src: "{{ playbook_dir }}/../../scripts/deploy.sh"
dest: "{{ scripts_dir }}/deploy.sh"
mode: "0755"
- name: Deploy .env files
ansible.builtin.template:
src: env.j2
dest: "{{ envs_dir }}/{{ item.name }}/.env"
mode: "0640"
loop: "{{ app_environments }}"
loop_control:
label: "{{ item.name }}"
- name: Clone or update repository
ansible.builtin.git:
repo: "{{ repo_url }}"
dest: "{{ repo_dir }}"
version: "{{ repo_branch }}"
force: true
accept_hostkey: true
- name: Build and deploy
ansible.builtin.include_tasks: deploy.yml

@@ -0,0 +1,6 @@
DATABASE_URI=postgres://{{ item.db_user }}:{{ item.db_password }}@{{ postgres_container_name }}:5432/{{ item.db_name }}
PAYLOAD_SECRET={{ item.payload_secret }}
NEXT_PUBLIC_SERVER_URL=https://{{ item.domain }}
NEXT_PUBLIC_SITE_ID={{ item.site_id }}
GOOGLE_BUCKET={{ item.google_bucket }}
RESEND_API_KEY={{ item.resend_api_key }}

@@ -0,0 +1,43 @@
---
- name: Install Caddy dependencies
ansible.builtin.apt:
name:
- debian-keyring
- debian-archive-keyring
- apt-transport-https
- curl
state: present
- name: Add Caddy GPG key
ansible.builtin.shell:
cmd: curl -fsSL https://dl.cloudsmith.io/public/caddy/stable/gpg.key -o /etc/apt/keyrings/caddy-stable-archive-keyring.asc && chmod 644 /etc/apt/keyrings/caddy-stable-archive-keyring.asc
creates: /etc/apt/keyrings/caddy-stable-archive-keyring.asc
- name: Add Caddy apt repository
ansible.builtin.apt_repository:
repo: "deb [signed-by=/etc/apt/keyrings/caddy-stable-archive-keyring.asc] https://dl.cloudsmith.io/public/caddy/stable/deb/ubuntu any-version main"
state: present
- name: Install Caddy
ansible.builtin.apt:
name: caddy
state: present
update_cache: true
- name: Deploy Caddyfile
ansible.builtin.template:
src: Caddyfile.j2
dest: /etc/caddy/Caddyfile
mode: "0644"
register: caddyfile_result
- name: Enable and start Caddy
ansible.builtin.systemd:
name: caddy
enabled: true
state: started
- name: Reload Caddy
ansible.builtin.systemd:
name: caddy
state: reloaded

@@ -0,0 +1,6 @@
{% for site in caddy_domains %}
{{ site.domain }} {
reverse_proxy localhost:{{ site.proxy_port }}
}
{% endfor %}

@@ -0,0 +1,110 @@
---
- name: Update apt cache
ansible.builtin.apt:
update_cache: true
cache_valid_time: 3600
- name: Install essential packages
ansible.builtin.apt:
name:
- apt-transport-https
- ca-certificates
- curl
- gnupg
- lsb-release
- ufw
- fail2ban
- git
state: present
# Firewall
- name: Configure UFW rules
ansible.builtin.shell: |
ufw allow 22/tcp
ufw allow 80/tcp
ufw allow 443/tcp
ufw allow {{ forgejo_ssh_port | default(2222) }}/tcp
ufw --force enable
ufw default deny incoming
changed_when: false
# Fail2ban
- name: Enable fail2ban
ansible.builtin.systemd:
name: fail2ban
enabled: true
state: started
# Docker
- name: Ensure keyrings directory exists
ansible.builtin.file:
path: /etc/apt/keyrings
state: directory
mode: "0755"
- name: Add Docker GPG key
ansible.builtin.shell:
cmd: curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc && chmod 644 /etc/apt/keyrings/docker.asc
creates: /etc/apt/keyrings/docker.asc
- name: Add Docker apt repository
ansible.builtin.apt_repository:
repo: "deb [arch=amd64 signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu {{ ansible_distribution_release }} stable"
state: present
- name: Install Docker
ansible.builtin.apt:
name:
- docker-ce
- docker-ce-cli
- containerd.io
- docker-buildx-plugin
state: present
update_cache: true
- name: Start Docker
ansible.builtin.systemd:
name: docker
enabled: true
state: started
# Docker network
- name: Create Docker network
ansible.builtin.shell: docker network inspect {{ docker_network }} >/dev/null 2>&1 || docker network create {{ docker_network }}
changed_when: false
# SSH key (for cloning from Forgejo)
- name: Generate SSH key
ansible.builtin.shell:
cmd: ssh-keygen -t ed25519 -f /root/.ssh/id_ed25519 -N "" -q
creates: /root/.ssh/id_ed25519
- name: Read SSH public key
ansible.builtin.command: cat /root/.ssh/id_ed25519.pub
register: ssh_public_key
changed_when: false
- name: Show SSH public key
ansible.builtin.debug:
msg: "Add this SSH key to Forgejo (Settings > SSH Keys): {{ ssh_public_key.stdout }}"
# App directories
- name: Create app directories
ansible.builtin.file:
path: "{{ item }}"
state: directory
mode: "0755"
loop:
- /opt/church-website
- "{{ repo_dir }}"
- "{{ envs_dir }}"
- "{{ scripts_dir }}"
- name: Create environment directories
ansible.builtin.file:
path: "{{ envs_dir }}/{{ item.name }}"
state: directory
mode: "0750"
loop: "{{ app_environments }}"
loop_control:
label: "{{ item.name }}"

@@ -0,0 +1,30 @@
---
- name: Create Forgejo directories
ansible.builtin.file:
path: "{{ item }}"
state: directory
mode: "0755"
loop:
- /opt/forgejo
- /opt/forgejo/data
- /opt/forgejo/runner
- name: Deploy Forgejo Docker Compose file
ansible.builtin.template:
src: docker-compose.forgejo.yml.j2
dest: /opt/forgejo/docker-compose.yml
mode: "0644"
- name: Start Forgejo services
ansible.builtin.shell: docker compose up -d
args:
chdir: /opt/forgejo
- name: Wait for Forgejo to be ready
ansible.builtin.uri:
url: "http://localhost:{{ forgejo_port }}"
status_code: 200
register: forgejo_health
retries: 15
delay: 5
until: forgejo_health.status == 200

@@ -0,0 +1,41 @@
services:
forgejo:
image: codeberg.org/forgejo/forgejo:9
container_name: {{ forgejo_container_name }}
restart: unless-stopped
networks:
- {{ docker_network }}
volumes:
- ./data:/data
- /etc/timezone:/etc/timezone:ro
- /etc/localtime:/etc/localtime:ro
ports:
- "127.0.0.1:{{ forgejo_port }}:3000"
- "{{ forgejo_ssh_port }}:22"
environment:
- USER_UID=1000
- USER_GID=1000
- FORGEJO__server__ROOT_URL=https://{{ forgejo_domain }}
- FORGEJO__server__SSH_DOMAIN={{ forgejo_domain }}
- FORGEJO__server__SSH_PORT={{ forgejo_ssh_port }}
- FORGEJO__actions__ENABLED=true
runner:
image: code.forgejo.org/forgejo/runner:6.2.2
container_name: forgejo-runner
command: forgejo-runner daemon
restart: unless-stopped
user: "0:0"
networks:
- {{ docker_network }}
volumes:
- ./runner:/data
- /var/run/docker.sock:/var/run/docker.sock
environment:
- DOCKER_HOST=unix:///var/run/docker.sock
depends_on:
- forgejo
networks:
{{ docker_network }}:
external: true

@@ -0,0 +1,63 @@
---
- name: Create PostgreSQL init script directory
ansible.builtin.file:
path: /opt/church-website/postgres-init
state: directory
mode: "0755"
- name: Deploy database init script
ansible.builtin.template:
src: init-databases.sh.j2
dest: /opt/church-website/postgres-init/init-databases.sh
mode: "0755"
- name: Check if PostgreSQL container exists
ansible.builtin.shell: docker ps -a --filter name=^{{ postgres_container_name }}$ --format '{{ '{{' }}.Status{{ '}}' }}'
register: postgres_status
changed_when: false
- name: Start PostgreSQL container
ansible.builtin.shell: |
docker run -d \
--name {{ postgres_container_name }} \
--restart unless-stopped \
--network {{ docker_network }} \
-v {{ postgres_volume }}:/var/lib/postgresql/data \
-v /opt/church-website/postgres-init:/docker-entrypoint-initdb.d:ro \
-e POSTGRES_USER=postgres \
-e POSTGRES_PASSWORD={{ vault_postgres_root_password }} \
-p 127.0.0.1:5432:5432 \
{{ postgres_image }}
when: postgres_status.stdout == ""
- name: Wait for PostgreSQL to be ready
ansible.builtin.shell: docker exec {{ postgres_container_name }} pg_isready -U postgres
register: pg_ready
retries: 10
delay: 3
until: pg_ready.rc == 0
changed_when: false
- name: Create databases and users
ansible.builtin.shell: |
docker exec {{ postgres_container_name }} psql -U postgres -c "
DO \$\$
BEGIN
IF NOT EXISTS (SELECT FROM pg_catalog.pg_roles WHERE rolname = '{{ item.user }}') THEN
CREATE ROLE {{ item.user }} WITH LOGIN PASSWORD '{{ item.password }}';
END IF;
END
\$\$;
"
docker exec {{ postgres_container_name }} psql -U postgres -tc "SELECT 1 FROM pg_database WHERE datname = '{{ item.name }}'" | grep -q 1 || \
docker exec {{ postgres_container_name }} psql -U postgres -c "CREATE DATABASE {{ item.name }} OWNER {{ item.user }}"
loop: "{{ databases }}"
loop_control:
label: "{{ item.name }}"
- name: Enable PostGIS extension on each database
ansible.builtin.shell: |
docker exec {{ postgres_container_name }} psql -U postgres -d {{ item.name }} -c "CREATE EXTENSION IF NOT EXISTS postgis;"
loop: "{{ databases }}"
loop_control:
label: "{{ item.name }}"


@ -0,0 +1,20 @@
#!/bin/bash
# This script runs on first PostgreSQL container start only
# (placed in /docker-entrypoint-initdb.d/)
set -e
{% for db in databases %}
echo "Creating database {{ db.name }} with user {{ db.user }}..."
psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" <<-EOSQL
CREATE USER {{ db.user }} WITH PASSWORD '{{ db.password }}';
CREATE DATABASE {{ db.name }} OWNER {{ db.user }};
GRANT ALL PRIVILEGES ON DATABASE {{ db.name }} TO {{ db.user }};
EOSQL
psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" -d {{ db.name }} <<-EOSQL
CREATE EXTENSION IF NOT EXISTS postgis;
EOSQL
{% endfor %}
echo "Database initialization complete."

infra/scripts/deploy.sh Executable file

@ -0,0 +1,61 @@
#!/bin/bash
set -euo pipefail
ENV_NAME=$1 # "staging" or "test"
APP_PORT=$2 # 3001 or 3002
REPO_DIR="/opt/church-website/repo"
ENV_DIR="/opt/church-website/envs/${ENV_NAME}"
CONTAINER_NAME="app-${ENV_NAME}"
IMAGE_NAME="church-website:${ENV_NAME}"
MIGRATE_IMAGE="church-website-migrate:${ENV_NAME}"
NETWORK_NAME="church-website-net"
if [ ! -f "${ENV_DIR}/.env" ]; then
echo "Error: ${ENV_DIR}/.env not found"
exit 1
fi
echo "==> Building app image ${IMAGE_NAME}..."
docker build \
--build-arg NEXT_PUBLIC_SERVER_URL="$(grep NEXT_PUBLIC_SERVER_URL "${ENV_DIR}/.env" | cut -d= -f2-)" \
--build-arg NEXT_PUBLIC_SITE_ID="$(grep NEXT_PUBLIC_SITE_ID "${ENV_DIR}/.env" | cut -d= -f2-)" \
-t "${IMAGE_NAME}" \
"${REPO_DIR}"
echo "==> Building migration image..."
docker build \
--target builder \
--build-arg NEXT_PUBLIC_SERVER_URL="$(grep NEXT_PUBLIC_SERVER_URL "${ENV_DIR}/.env" | cut -d= -f2-)" \
--build-arg NEXT_PUBLIC_SITE_ID="$(grep NEXT_PUBLIC_SITE_ID "${ENV_DIR}/.env" | cut -d= -f2-)" \
-t "${MIGRATE_IMAGE}" \
"${REPO_DIR}"
echo "==> Running database migrations..."
docker run --rm \
--network "${NETWORK_NAME}" \
--env-file "${ENV_DIR}/.env" \
"${MIGRATE_IMAGE}" \
npx payload migrate
echo "==> Stopping old container..."
docker stop "${CONTAINER_NAME}" 2>/dev/null || true
docker rm "${CONTAINER_NAME}" 2>/dev/null || true
echo "==> Starting new container on port ${APP_PORT}..."
docker run -d \
--name "${CONTAINER_NAME}" \
--restart unless-stopped \
--network "${NETWORK_NAME}" \
--env-file "${ENV_DIR}/.env" \
-v "uploads-${ENV_NAME}-media:/app/media" \
-v "uploads-${ENV_NAME}-documents:/app/documents" \
-p "127.0.0.1:${APP_PORT}:3000" \
"${IMAGE_NAME}"
echo "==> Fixing volume permissions..."
docker exec -u 0 "${CONTAINER_NAME}" chown -R 1001:1001 /app/media /app/documents
echo "==> Cleaning up old images..."
docker image prune -f
echo "==> Done! ${ENV_NAME} deployed on port ${APP_PORT}"
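The deploy script extracts build args with `grep KEY "${ENV_DIR}/.env" | cut -d= -f2-`, which matches any line merely *containing* the key, including commented-out lines or longer keys that share the prefix. A sketch of an exact-key lookup, shown in TypeScript for illustration (`readEnvValue` is a hypothetical helper, not part of this repo):

```typescript
// Minimal .env lookup sketch (assumption: plain KEY=VALUE lines, no quoting
// or export syntax). Unlike `grep KEY file | cut -d= -f2-`, it compares the
// key exactly, so NEXT_PUBLIC_SERVER_URL never matches a longer key or a
// commented-out line.
function readEnvValue(envContent: string, key: string): string | undefined {
  for (const raw of envContent.split('\n')) {
    const line = raw.trim()
    if (!line || line.startsWith('#')) continue // skip blanks and comments
    const eq = line.indexOf('=')
    if (eq === -1) continue
    if (line.slice(0, eq).trim() === key) return line.slice(eq + 1)
  }
  return undefined
}

const sample = [
  '# NEXT_PUBLIC_SERVER_URL=https://old.example.org',
  'NEXT_PUBLIC_SERVER_URL=https://staging.example.org',
  'NEXT_PUBLIC_SERVER_URL_INTERNAL=http://app-staging:3000',
].join('\n')

console.log(readEnvValue(sample, 'NEXT_PUBLIC_SERVER_URL'))
```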


@ -5,9 +5,11 @@
"license": "MIT",
"type": "module",
"scripts": {
+"prebuild": "node --env-file-if-exists=.env scripts/copy-favicon.mjs",
"build": "cross-env NODE_OPTIONS=--no-deprecation next build",
+"predev": "node --env-file-if-exists=.env scripts/copy-favicon.mjs",
"dev": "cross-env NODE_OPTIONS=--no-deprecation next dev",
-"devsafe": "rm -rf .next && cross-env NODE_OPTIONS=--no-deprecation next dev",
+"devsafe": "rm -rf .next && node --env-file-if-exists=.env scripts/copy-favicon.mjs && cross-env NODE_OPTIONS=--no-deprecation next dev",
"generate:types": "payload generate:types",
"lint": "cross-env NODE_OPTIONS=--no-deprecation next lint",
"payload": "cross-env NODE_OPTIONS=--no-deprecation payload",

scripts/copy-favicon.mjs Normal file

@ -0,0 +1,34 @@
// White-labeling glue: each client under sites/<id>/ ships its own icon.ico.
// Next.js's App Router picks up the favicon via the file convention at
// src/app/(home)/icon.ico, which can only point to one file. This script runs
// before `next dev` / `next build` (see predev/prebuild in package.json) and
// copies the right per-site icon into that location based on NEXT_PUBLIC_SITE_ID.
// The destination is gitignored so it's treated as a build artifact.
import { copyFileSync, existsSync, readdirSync, statSync } from 'node:fs'
import { dirname, join, resolve } from 'node:path'
import { fileURLToPath } from 'node:url'
const __dirname = dirname(fileURLToPath(import.meta.url))
const repoRoot = resolve(__dirname, '..')
// Mirror the default in src/config/site.ts so an unset env var produces the
// same site here as it does at runtime.
const siteId = process.env.NEXT_PUBLIC_SITE_ID || 'dreikoenige'
const sitesDir = join(repoRoot, 'sites')
const source = join(sitesDir, siteId, 'icon.ico')
const destination = join(repoRoot, 'src', 'app', '(home)', 'icon.ico')
// Fail loudly with the list of valid site ids — same UX as the runtime check
// in src/config/site.ts when an unknown NEXT_PUBLIC_SITE_ID is supplied.
if (!existsSync(source)) {
const available = readdirSync(sitesDir)
.filter((entry) => statSync(join(sitesDir, entry)).isDirectory())
.join(', ')
throw new Error(
`[copy-favicon] No icon.ico for site "${siteId}" at ${source}. Available sites: ${available}`,
)
}
copyFileSync(source, destination)
console.log(`[copy-favicon] ${siteId} → src/app/(home)/icon.ico`)

BIN
sites/chemnitz/icon.ico Normal file


@ -169,11 +169,7 @@ Chemnitz"
style="opacity:1;fill:#000000"
id="path102835" />
<path
d="m 673.3066,121.69963 v 25.19148 h -6.08917 v -25.19148 z m 0.83481,-7.34137 q 0,0.78569 -0.31919,1.47318 -0.3192,0.68749 -0.85936,1.2031 -0.51562,0.51562 -1.22766,0.83481 -0.71204,0.29464 -1.52229,0.29464 -0.7857,0 -1.49774,-0.29464 -0.68749,-0.31919 -1.2031,-0.83481 -0.51562,-0.51561 -0.83481,-1.2031 -0.29463,-0.68749 -0.29463,-1.47318 0,-0.81026 0.29463,-1.5223 0.31919,-0.71204 0.83481,-1.22765 0.51561,-0.51562 1.2031,-0.81025 0.71204,-0.31919 1.49774,-0.31919 0.81025,0 1.52229,0.31919 0.71204,0.29463 1.22766,0.81025 0.54016,0.51561 0.85936,1.22765 0.31919,0.71204 0.31919,1.5223 z" d="m 684.3066,146.89111 q -0.85936,0 -1.32587,-0.24553 -0.46651,-0.27008 -0.73659,-1.05578 l -0.54017,-1.79238 q -0.95757,0.85936 -1.86604,1.5223 -0.90846,0.63838 -1.89058,1.08033 -0.98213,0.44196 -2.11157,0.66294 -1.10489,0.22097 -2.45531,0.22097 -1.59595,0 -2.94637,-0.4174 -1.35042,-0.44195 -2.33255,-1.30131 -0.98212,-0.85936 -1.52229,-2.13612 -0.54017,-1.27676 -0.54017,-2.97093 0,-1.42408 0.73659,-2.79905 0.76115,-1.39953 2.50442,-2.50442 1.74327,-1.12944 4.64054,-1.86603 2.89726,-0.7366 7.21861,-0.83481 v -1.47319 q 0,-2.52896 -1.08034,-3.73207 -1.08033,-1.22765 -3.11824,-1.22765 -1.47319,0 -2.45531,0.34374 -0.95757,0.34375 -1.69417,0.7857 -0.73659,0.4174 -1.35042,0.76115 -0.58927,0.34374 -1.32586,0.34374 -0.63838,0 -1.08034,-0.31919 -0.44196,-0.34374 -0.71204,-0.7857 l -1.08034,-1.93969 q 4.3459,-3.97761 10.45962,-3.97761 2.20978,0 3.95305,0.7366 1.74327,0.71204 2.94638,2.01335 1.2031,1.27676 1.81692,3.06914 0.63839,1.79238 0.63839,3.92849 v 15.91041 z m -9.0601,-3.78117 q 1.86604,0 3.21646,-0.66294 1.37497,-0.68748 2.67629,-2.06246 v -4.24768 q -2.65174,0.12276 -4.44411,0.4665 -1.76783,0.3192 -2.84816,0.83481 -1.08034,0.51562 -1.54685,1.2031 -0.46651,0.68749 -0.46651,1.49774 0,1.59595 0.93302,2.28344 0.95757,0.68749 2.47986,0.68749 z"
style="opacity:1;fill:#000000"
id="path102837" />
<path
d="m 695.9691,146.89111 q -0.85936,0 -1.32587,-0.24553 -0.46651,-0.27008 -0.73659,-1.05578 l -0.54017,-1.79238 q -0.95757,0.85936 -1.86604,1.5223 -0.90846,0.63838 -1.89058,1.08033 -0.98213,0.44196 -2.11157,0.66294 -1.10489,0.22097 -2.45531,0.22097 -1.59595,0 -2.94637,-0.4174 -1.35042,-0.44195 -2.33255,-1.30131 -0.98212,-0.85936 -1.52229,-2.13612 -0.54017,-1.27676 -0.54017,-2.97093 0,-1.42408 0.73659,-2.79905 0.76115,-1.39953 2.50442,-2.50442 1.74327,-1.12944 4.64054,-1.86603 2.89726,-0.7366 7.21861,-0.83481 v -1.47319 q 0,-2.52896 -1.08034,-3.73207 -1.08033,-1.22765 -3.11824,-1.22765 -1.47319,0 -2.45531,0.34374 -0.95757,0.34375 -1.69417,0.7857 -0.73659,0.4174 -1.35042,0.76115 -0.58927,0.34374 -1.32586,0.34374 -0.63838,0 -1.08034,-0.31919 -0.44196,-0.34374 -0.71204,-0.7857 l -1.08034,-1.93969 q 4.3459,-3.97761 10.45962,-3.97761 2.20978,0 3.95305,0.7366 1.74327,0.71204 2.94638,2.01335 1.2031,1.27676 1.81692,3.06914 0.63839,1.79238 0.63839,3.92849 v 15.91041 z m -9.0601,-3.78117 q 1.86604,0 3.21646,-0.66294 1.37497,-0.68748 2.67629,-2.06246 v -4.24768 q -2.65174,0.12276 -4.44411,0.4665 -1.76783,0.3192 -2.84816,0.83481 -1.08034,0.51562 -1.54685,1.2031 -0.46651,0.68749 -0.46651,1.49774 0,1.59595 0.93302,2.28344 0.95757,0.68749 2.47986,0.68749 z"
style="opacity:1;fill:#000000"
id="path102839" />
<path


@ -168,11 +168,7 @@ Chemnitz"
style="opacity:1;fill:#000000"
id="path102835" />
<path
d="m 673.3066,121.69963 v 25.19148 h -6.08917 v -25.19148 z m 0.83481,-7.34137 q 0,0.78569 -0.31919,1.47318 -0.3192,0.68749 -0.85936,1.2031 -0.51562,0.51562 -1.22766,0.83481 -0.71204,0.29464 -1.52229,0.29464 -0.7857,0 -1.49774,-0.29464 -0.68749,-0.31919 -1.2031,-0.83481 -0.51562,-0.51561 -0.83481,-1.2031 -0.29463,-0.68749 -0.29463,-1.47318 0,-0.81026 0.29463,-1.5223 0.31919,-0.71204 0.83481,-1.22765 0.51561,-0.51562 1.2031,-0.81025 0.71204,-0.31919 1.49774,-0.31919 0.81025,0 1.52229,0.31919 0.71204,0.29463 1.22766,0.81025 0.54016,0.51561 0.85936,1.22765 0.31919,0.71204 0.31919,1.5223 z" d="m 684.3066,146.89111 q -0.85936,0 -1.32587,-0.24553 -0.46651,-0.27008 -0.73659,-1.05578 l -0.54017,-1.79238 q -0.95757,0.85936 -1.86604,1.5223 -0.90846,0.63838 -1.89058,1.08033 -0.98213,0.44196 -2.11157,0.66294 -1.10489,0.22097 -2.45531,0.22097 -1.59595,0 -2.94637,-0.4174 -1.35042,-0.44195 -2.33255,-1.30131 -0.98212,-0.85936 -1.52229,-2.13612 -0.54017,-1.27676 -0.54017,-2.97093 0,-1.42408 0.73659,-2.79905 0.76115,-1.39953 2.50442,-2.50442 1.74327,-1.12944 4.64054,-1.86603 2.89726,-0.7366 7.21861,-0.83481 v -1.47319 q 0,-2.52896 -1.08034,-3.73207 -1.08033,-1.22765 -3.11824,-1.22765 -1.47319,0 -2.45531,0.34374 -0.95757,0.34375 -1.69417,0.7857 -0.73659,0.4174 -1.35042,0.76115 -0.58927,0.34374 -1.32586,0.34374 -0.63838,0 -1.08034,-0.31919 -0.44196,-0.34374 -0.71204,-0.7857 l -1.08034,-1.93969 q 4.3459,-3.97761 10.45962,-3.97761 2.20978,0 3.95305,0.7366 1.74327,0.71204 2.94638,2.01335 1.2031,1.27676 1.81692,3.06914 0.63839,1.79238 0.63839,3.92849 v 15.91041 z m -9.0601,-3.78117 q 1.86604,0 3.21646,-0.66294 1.37497,-0.68748 2.67629,-2.06246 v -4.24768 q -2.65174,0.12276 -4.44411,0.4665 -1.76783,0.3192 -2.84816,0.83481 -1.08034,0.51562 -1.54685,1.2031 -0.46651,0.68749 -0.46651,1.49774 0,1.59595 0.93302,2.28344 0.95757,0.68749 2.47986,0.68749 z"
style="opacity:1;fill:#000000"
id="path102837" />
<path
d="m 695.9691,146.89111 q -0.85936,0 -1.32587,-0.24553 -0.46651,-0.27008 -0.73659,-1.05578 l -0.54017,-1.79238 q -0.95757,0.85936 -1.86604,1.5223 -0.90846,0.63838 -1.89058,1.08033 -0.98213,0.44196 -2.11157,0.66294 -1.10489,0.22097 -2.45531,0.22097 -1.59595,0 -2.94637,-0.4174 -1.35042,-0.44195 -2.33255,-1.30131 -0.98212,-0.85936 -1.52229,-2.13612 -0.54017,-1.27676 -0.54017,-2.97093 0,-1.42408 0.73659,-2.79905 0.76115,-1.39953 2.50442,-2.50442 1.74327,-1.12944 4.64054,-1.86603 2.89726,-0.7366 7.21861,-0.83481 v -1.47319 q 0,-2.52896 -1.08034,-3.73207 -1.08033,-1.22765 -3.11824,-1.22765 -1.47319,0 -2.45531,0.34374 -0.95757,0.34375 -1.69417,0.7857 -0.73659,0.4174 -1.35042,0.76115 -0.58927,0.34374 -1.32586,0.34374 -0.63838,0 -1.08034,-0.31919 -0.44196,-0.34374 -0.71204,-0.7857 l -1.08034,-1.93969 q 4.3459,-3.97761 10.45962,-3.97761 2.20978,0 3.95305,0.7366 1.74327,0.71204 2.94638,2.01335 1.2031,1.27676 1.81692,3.06914 0.63839,1.79238 0.63839,3.92849 v 15.91041 z m -9.0601,-3.78117 q 1.86604,0 3.21646,-0.66294 1.37497,-0.68748 2.67629,-2.06246 v -4.24768 q -2.65174,0.12276 -4.44411,0.4665 -1.76783,0.3192 -2.84816,0.83481 -1.08034,0.51562 -1.54685,1.2031 -0.46651,0.68749 -0.46651,1.49774 0,1.59595 0.93302,2.28344 0.95757,0.68749 2.47986,0.68749 z"
style="opacity:1;fill:#000000"
id="path102839" />
<path


@ -9,12 +9,12 @@ import { RefreshRouteOnSave } from '@/components/RefreshRouteOnSave/RefreshRoute
import { draftMode } from 'next/headers'
type Props = {
-params: Promise<{ slug: string }>
+params: Promise<{ slug?: string[] }>
}
export async function generateMetadata({ params }: Props): Promise<Metadata> {
const slug = (await params).slug
-const page = await fetchPageBySlug(slug)
+const page = await fetchPageBySlug(slug?.join('/') || "")
if (!page) return {}
@ -27,15 +27,15 @@ export async function generateMetadata({ params }: Props): Promise<Metadata> {
export default async function DynamicPage({ params }: Props) {
const slug = (await params).slug
const { isEnabled: isDraft } = await draftMode()
-const page = await fetchPageBySlug(slug, isDraft)
+const page = await fetchPageBySlug(slug?.join('/') || "", isDraft)
const authenticated = await isAuthenticated()
if (!page) {
notFound()
}
-if(!authenticated && page._status !== "published") {
-notFound();
+if (!authenticated && page._status !== 'published') {
+notFound()
}
const firstBlockType = page.content?.[0]?.blockType
@ -52,7 +52,7 @@ export default async function DynamicPage({ params }: Props) {
)}
<AdminMenu
-collection={"pages"}
+collection={'pages'}
id={page.id}
isAuthenticated={authenticated}
/>
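The route change above swaps a single `[slug]` segment for an optional catch-all, so `params.slug` is `undefined` on the root path and an array of path segments everywhere else. A minimal sketch of the join logic used at both call sites (`slugFromParams` is a hypothetical helper name):

```typescript
// With app/[[...slug]]/page.tsx, "/" yields slug === undefined and
// "/gemeinde/kontakt" yields ['gemeinde', 'kontakt']; joining with "/"
// rebuilds the slug string stored on the Pages collection.
function slugFromParams(slug?: string[]): string {
  return slug?.join('/') ?? ''
}

console.log(slugFromParams(['gemeinde', 'kontakt'])) // → gemeinde/kontakt
```

This is also why the `slug` field on Pages becomes optional with `defaultValue: ''` in this diff: the homepage is simply the page whose slug is the empty string.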


@ -14,7 +14,7 @@ import { getPhoto } from '@/utils/dto/gallery'
import { isAuthenticated } from '@/utils/auth'
import { AdminMenu } from '@/components/AdminMenu/AdminMenu'
import { GroupEvents } from '@/compositions/GroupEvents/GroupEvents'
-import { RichText } from '@payloadcms/richtext-lexical/react'
+import { RichText } from '@/components/Text/RichText'
import { RefreshRouteOnSave } from '@/components/RefreshRouteOnSave/RefreshRouteOnSave'
import { draftMode } from 'next/headers'


@ -1,7 +1,7 @@
import { PageHeader } from '@/compositions/PageHeader/PageHeader'
import { fetchDonationForm } from '@/fetch/donationform'
import { notFound } from 'next/navigation'
-import { RichText } from '@payloadcms/richtext-lexical/react'
+import { RichText } from '@/components/Text/RichText'
import { Container } from '@/components/Container/Container'
import { Section } from '@/components/Section/Section'
import styles from '@/components/DonationForm/styles.module.scss'


@ -7,6 +7,7 @@ import { GalleryBlock } from '@/collections/blocks/Gallery'
import { DonationBlock } from '@/collections/blocks/Donation'
import { ButtonBlock } from '@/collections/blocks/Button'
import { YoutubePlayerBlock } from '@/collections/blocks/YoutubePlayer'
+import { ImageCardsBlock } from '@/collections/blocks/ImageCards'
import { isPublishedPublic } from '@/collections/access/public'
@ -64,6 +65,7 @@ export const Blog: CollectionConfig = {
GalleryBlock,
YoutubePlayerBlock,
ButtonBlock,
+ImageCardsBlock,
],
required: true,
},


@ -28,10 +28,180 @@ export const Churches: CollectionConfig = {
type: 'textarea',
required: true,
},
{
name: 'recurringSchedule',
type: 'array',
label: {
de: 'Wiederkehrende Messzeiten',
},
labels: {
singular: {
de: 'Wiederkehrende Messzeit',
},
plural: {
de: 'Wiederkehrende Messzeiten',
},
},
admin: {
description:
'Wiederkehrende Gottesdienste werden automatisch für die nächsten Wochen als Einträge in der Gottesdienstliste angelegt. Einzelne Termine können dort weiterhin manuell bearbeitet (z. B. abgesagt) werden.',
},
fields: [
{
type: 'row',
fields: [
{
name: 'type',
label: {
de: 'Art',
},
type: 'select',
required: true,
options: [
{ label: 'Heilige Messe', value: 'MASS' },
{ label: 'Familien Messe', value: 'FAMILY' },
{ label: 'Wort-Gottes-Feier', value: 'WORD' },
],
},
{
name: 'frequency',
label: {
de: 'Häufigkeit',
},
type: 'select',
required: true,
defaultValue: 'weekly',
options: [
{ label: 'Wöchentlich', value: 'weekly' },
{ label: 'Alle 2 Wochen', value: 'biweekly' },
{
label: 'Monatlich (n-ter Wochentag)',
value: 'monthlyByWeekday',
},
],
},
],
},
{
type: 'row',
fields: [
{
name: 'day',
label: {
de: 'Tag',
},
type: 'select',
required: true,
options: [
{ label: 'Montag', value: 'monday' },
{ label: 'Dienstag', value: 'tuesday' },
{ label: 'Mittwoch', value: 'wednesday' },
{ label: 'Donnerstag', value: 'thursday' },
{ label: 'Freitag', value: 'friday' },
{ label: 'Samstag', value: 'saturday' },
{ label: 'Sonntag', value: 'sunday' },
],
},
{
name: 'time',
label: {
de: 'Uhrzeit',
},
type: 'date',
required: true,
admin: {
date: {
pickerAppearance: 'timeOnly',
timeIntervals: 15,
timeFormat: 'HH:mm',
},
},
},
],
},
{
name: 'weekOfMonth',
label: {
de: 'Wochentag im Monat',
},
type: 'select',
options: [
{ label: '1.', value: 'first' },
{ label: '2.', value: 'second' },
{ label: '3.', value: 'third' },
{ label: '4.', value: 'fourth' },
{ label: 'Letzter', value: 'last' },
],
admin: {
condition: (_data, siblingData) =>
siblingData?.frequency === 'monthlyByWeekday',
description:
'z. B. „3.“ + „Sonntag“ = jeden 3. Sonntag im Monat',
},
},
{
name: 'biweeklyAnchor',
label: {
de: 'Anker-Datum (alle 2 Wochen)',
},
type: 'date',
admin: {
date: {
pickerAppearance: 'dayOnly',
},
condition: (_data, siblingData) =>
siblingData?.frequency === 'biweekly',
description:
'Ein Datum, an dem dieser Termin stattfindet. Davon ausgehend wird im 2-Wochen-Rhythmus weitergerechnet.',
},
},
{
type: 'row',
fields: [
{
name: 'defaultCelebrant',
label: {
de: 'Standard-Zelebrant',
},
type: 'text',
},
{
name: 'defaultTitle',
label: {
de: 'Standard-Titel',
},
type: 'text',
},
],
},
{
name: 'defaultDescription',
label: {
de: 'Standard-Hinweise',
},
type: 'textarea',
admin: {
description:
'Wird als „Hinweise“ auf jeden automatisch erzeugten Gottesdienst übernommen, z. B. „Vor der Messe beten wir den Rosenkranz.“',
},
},
{
name: 'notes',
label: {
de: 'Notiz',
},
type: 'text',
admin: {
description:
'Nur intern, z. B. „Sommerferien: ausgesetzt“. Beeinflusst die Generierung nicht.',
},
},
],
},
],
admin: {
useAsTitle: 'name',
-hidden: hide
+hidden: hide,
},
access: {
read: () => true,
@ -39,4 +209,22 @@ export const Churches: CollectionConfig = {
update: isAdminOrEmployee(),
delete: isAdminOrEmployee(),
},
hooks: {
afterChange: [
async ({ doc, req }) => {
if (!doc.recurringSchedule?.length) return
try {
await req.payload.jobs.queue({
task: 'generateRecurringMasses',
input: { churchId: doc.id },
})
} catch (err) {
req.payload.logger.error(
{ err, churchId: doc.id },
'Failed to queue generateRecurringMasses job',
)
}
},
],
},
}
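The `afterChange` hook above only queues the job; the `generateRecurringMasses` task itself is not part of this diff. A sketch of the two non-obvious pieces of date math such a task needs — the 2-week parity check against `biweeklyAnchor` and the "n-th weekday of month" lookup for `weekOfMonth` + `day`. All function names here are assumptions, not code from the repo:

```typescript
// Index order matches Date.prototype.getDay() (0 = Sunday), and the strings
// match the option values of the `day` select field above.
const WEEKDAYS = ['sunday', 'monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday']

// Biweekly: a candidate date is an occurrence when it shares the anchor's
// weekday and the whole-day distance to the anchor is a multiple of 14.
// Math.round absorbs the ±1h drift a DST boundary introduces into getTime().
function isBiweeklyOccurrence(candidate: Date, anchor: Date): boolean {
  const msPerDay = 86_400_000
  const days = Math.round((candidate.getTime() - anchor.getTime()) / msPerDay)
  return candidate.getDay() === anchor.getDay() && ((days % 14) + 14) % 14 === 0
}

// Monthly by weekday: weekOfMonth 'third' + day 'sunday' = the 3rd Sunday.
// 'last' walks backwards from the final day of the month instead.
function nthWeekdayOfMonth(year: number, month: number, day: string, weekOfMonth: string): Date {
  const target = WEEKDAYS.indexOf(day)
  if (weekOfMonth === 'last') {
    const d = new Date(year, month + 1, 0) // day 0 of next month = last day of this month
    d.setDate(d.getDate() - ((d.getDay() - target + 7) % 7))
    return d
  }
  const nth = ['first', 'second', 'third', 'fourth'].indexOf(weekOfMonth)
  const first = new Date(year, month, 1)
  const offset = (target - first.getDay() + 7) % 7 // days until the first matching weekday
  return new Date(year, month, 1 + offset + nth * 7)
}

console.log(nthWeekdayOfMonth(2026, 3, 'sunday', 'third').toDateString())
```

The actual task would walk the rolling window described in the README, apply one of these predicates per `recurringSchedule` entry, and upsert `Worship` documents with `generated: true` so manual edits and cancellations survive re-runs.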


@ -20,6 +20,7 @@ import { CollapsibleImageWithTextBlock } from '@/collections/blocks/CollapsibleI
import { EventsBlock } from '@/collections/blocks/Events'
import { PublicationAndNewsletterBlock } from '@/collections/blocks/PublicationAndNewsletter'
import { ContactPersonBlock } from '@/collections/blocks/ContactPersonBlock'
+import { ImageCardsBlock } from '@/collections/blocks/ImageCards'
import { isPublishedPublic } from '@/collections/access/public'
export const Pages: CollectionConfig = {
@ -51,7 +52,8 @@ export const Pages: CollectionConfig = {
{
name: 'slug',
type: 'text',
-required: true,
+defaultValue: '',
+required: false,
unique: true,
label: {
de: 'Slug',
@ -99,6 +101,7 @@ export const Pages: CollectionConfig = {
MassTimesBlock,
EventsBlock,
ContactPersonBlock,
+ImageCardsBlock,
],
},
],


@ -5,6 +5,8 @@ import { DocumentBlock } from '@/collections/blocks/Document'
import { DonationBlock } from '@/collections/blocks/Donation'
import { YoutubePlayerBlock } from '@/collections/blocks/YoutubePlayer'
import { DonationAppeal } from '@/collections/blocks/DonationAppeal'
+import { ImageCardsBlock } from '@/collections/blocks/ImageCards'
+import { TitleBlock } from '@/collections/blocks/Title'
import { isPublishedPublic } from '@/collections/access/public'
export const Parish: CollectionConfig = {
@ -118,7 +120,9 @@ export const Parish: CollectionConfig = {
DocumentBlock,
DonationBlock,
YoutubePlayerBlock,
-DonationAppeal
+DonationAppeal,
+ImageCardsBlock,
+TitleBlock,
]
},
{


@ -74,6 +74,9 @@ export const Worship: CollectionConfig = {
label: {
de: 'Abgesagt',
},
admin: {
position: 'sidebar',
},
},
{
name: 'liturgicalDay',
@ -98,6 +101,18 @@ export const Worship: CollectionConfig = {
de: 'Hinweise',
},
},
{
name: 'generated',
type: 'checkbox',
defaultValue: false,
label: {
de: 'Automatisch erzeugt',
},
admin: {
readOnly: true,
position: 'sidebar',
},
},
],
admin: {
defaultColumns: ["date", 'location', 'type', 'celebrant'],


@ -0,0 +1,54 @@
import { Block } from 'payload'
export const ImageCardsBlock: Block = {
slug: 'imageCards',
labels: {
singular: { de: 'Bildkarten' },
plural: { de: 'Bildkarten' },
},
fields: [
{
name: 'items',
label: { de: 'Karten' },
type: 'array',
required: true,
minRows: 1,
fields: [
{
name: 'title',
label: { de: 'Titel' },
type: 'text',
required: true,
},
{
name: 'image',
label: { de: 'Bild' },
type: 'upload',
relationTo: 'media',
required: true,
},
{
name: 'link',
label: { de: 'Verknüpfung (Seite oder Gruppe)' },
type: 'relationship',
relationTo: ['pages', 'group'],
required: false,
admin: {
condition: (_, siblingData) => !siblingData?.customLink,
},
},
{
name: 'customLink',
label: { de: 'Eigener Link (URL)' },
type: 'text',
required: false,
admin: {
condition: (_, siblingData) => !siblingData?.link,
description:
'Alternative zu "Verknüpfung". Nur eins von beiden kann gesetzt werden.',
},
},
],
},
],
}
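The block stores either a `link` relationship (to a page or group) or a `customLink` URL, and the `admin.condition` on each field hides it once the other is filled, so at most one is set per card. A sketch of how a frontend component might resolve the card's href — `resolveCardHref` and the `/gruppen/…` route are assumptions for illustration, not code from this diff:

```typescript
// Shape of a populated polymorphic relationship as Payload returns it
// (simplified: only the fields the href needs).
type CardLink =
  | { relationTo: 'pages'; value: { slug?: string | null } }
  | { relationTo: 'group'; value: { id: string } }

function resolveCardHref(link?: CardLink | null, customLink?: string | null): string | undefined {
  if (customLink) return customLink // external/custom URL wins when present
  if (!link) return undefined // card without any link renders unlinked
  if (link.relationTo === 'pages') return `/${link.value.slug ?? ''}`
  return `/gruppen/${link.value.id}` // assumed group detail route
}

console.log(resolveCardHref(undefined, 'https://example.org'))
```

Because the two admin conditions only hide fields in the UI (they do not validate), a defensive resolver like this one should still pick a deterministic winner if both values somehow end up stored.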


@ -52,5 +52,22 @@ export const TitleBlock: Block = {
],
defaultValue: 'left',
},
{
name: 'color',
type: 'select',
label: {
de: 'Farbe',
},
required: true,
defaultValue: 'base',
options: [
{ label: 'Grundfarbe', value: 'base' },
{ label: 'Abstufung 1', value: 'shade1' },
{ label: 'Abstufung 2', value: 'shade2' },
{ label: 'Abstufung 3', value: 'shade3' },
{ label: 'Kontrastfarbe', value: 'contrast' },
{ label: 'Kontrast Abstufung 1', value: 'contrastShade1' },
],
},
],
}


@ -10,7 +10,7 @@ export type Church =
| 'antonius'
| 'marien'
| 'maria'
-| 'antoniusFalkenberg'
+| 'antoniusFrankenberg'
| 'johannesNepomuk'
type ChurchIconProps = {
@ -59,15 +59,15 @@ export const ChurchIcon = ({church, style, stroke, color}: ChurchIconProps) => {
if (church === 'joseph') {
return (
-<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 243.78 243.78">
+<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 238.46 281.04">
<g
fill={style === 'outline' ? '#ffffff' : color}
stroke={color}
strokeWidth={stroke}
strokeMiterlimit={10}
>
<path d="M146.38,160.38c0,6.94,3.11,12.38,7.09,12.38s7.09-5.44,7.09-12.38-3.11-12.38-7.09-12.38-7.09,5.44-7.09,12.38ZM153.47,150.48c2.18,0,4.61,4.07,4.61,9.9s-2.43,9.9-4.61,9.9-4.61-4.07-4.61-9.9,2.43-9.9,4.61-9.9Z" /> <path className="cls-1"
<path d="M175.61,75.38l-19.75-44.65v-9.17h2.99v-2.48h-2.99v-4.92h-2.48v4.92h-2.99v2.48h2.99v9.2l-16.95,43.7v45.87l-77.66,13.62v91.58h110.15v-52.78h6.69v-97.38ZM138.92,76.13l19.17-7.81v22.98l-19.17,6.79v-21.96ZM160.57,68.74l12.57,8.01v24.6l-12.57-9.77v-22.84ZM160.16,65.54l-9.21-21.47,3.75-9.8,16.9,38.57-11.44-7.29ZM149.69,47.37l7.99,18.43-17.83,7.27,9.83-25.7ZM61.25,136.04l88.3-15.49-4.73,5.68-7.82,9.38-75.76,14.12v-13.69ZM136.43,223.06H61.25v-70.8l55.46-10.34,19.72-3.68v84.81ZM166.44,223.06h-27.53v-85.86l8.11-9.73,5.41-6.49,14.01,27.36v74.71ZM168.92,170.28v-22.53l-14.77-28.84-.79-1.54-1.35.24-13.1,2.3v-19.17l20.18-7.16,14.03,10.91v65.8h-4.21Z" /> d="M174.61,59.65h0l-11.73-28.56v-5.84h2.84v-3h-2.84v-4.04h-3v4.04h-2.84v3h2.84v5.87l-9.56,26.23-8.2,26.37-.53,34.64-4.77-6.79v-5.84h2.34v-3h-2.34v-2.78h-3v2.78h-2.34v3h2.34v5.45l-29.91,17.18v6.51l-54.73,30.91,1.48,2.61,4.88-2.75,9,19.54v56.39l34.19,5.29v-57.46l-14.46-22.17-23.8-4.37,43.45-24.54v111.98l-51.66-8.65-.5,2.96,53.38,8.93.21.04,79.45-9.58V83.36l-10.18-23.71ZM95.7,243.31l-28.19-4.27v-52.4l28.19,4.5v52.18ZM94.06,187.79l-27.01-4.29-8.26-17.93,23.63,4.34,11.64,17.88ZM167.74,238.88l-22.69,2.43v-30.23c0-8.53,5.09-15.47,11.35-15.47s11.35,6.94,11.35,15.47v27.8ZM156.39,192.6c-7.91,0-14.35,8.29-14.35,18.47v30.56l-1.09.12v-50.34l18.42-20.5,3.08,5.01,10.15,16.81v45.62l-1.87.2v-27.48c0-10.19-6.44-18.47-14.35-18.47ZM116,193.16l21.96-.77v49.7l-21.96,2.55v-51.47ZM138.76,189.36l-20.06.71,13.79-11.9,17.9-1.75-11.63,12.94ZM181.79,104.01l-17.86-11.23-.4-18.62,18.27,10.91v18.94ZM162.99,39.27l8.46,20.6-8.04,8.04-.42-28.64ZM160.93,92.86l-16.09,8.66.25-16.23,15.44-10.95.4,18.53ZM144.79,104.96l16.21-8.73.86,40.05-6.29,2.72-11.06-16.43.27-17.6ZM161.93,139.51l.2,9.22-4.85-7.21,4.65-2.01ZM164,96.37l17.79,11.18v39.67l-16.92-10.86-.86-39.99ZM164.39,71.18l8.31-8.31,7.44,17.72-15.75-9.41ZM159.99,39.66l.41,28.07-7.67-7.88,7.26-20.19ZM151.7,63.09l7.97,8.18-13.35,9.47,5.38-17.65ZM106.89,250.37v-120.27l27.97-16.06,6.9,9.82,20.47,30.41v15.57l-2.42-3.
92-6.46,7.19-22.11,2.16-18.26,15.98.04,56.11,35.17-3.91,27.44-3.05-.04-48.51-10.37-17.18v-21.36h-.01s-.29-13.38-.29-13.38l16.85,10.81v90.55l-74.89,9.03Z" />
</g>
</svg>
)
@ -82,7 +82,8 @@ export const ChurchIcon = ({church, style, stroke, color}: ChurchIconProps) => {
strokeWidth={stroke}
strokeMiterlimit={10}
>
<path d="M195.93,130.1v-51.2l-32.17-17.31v-7.08h4.6v-3.3h-4.6v-4.54h-3.3v4.54h-4.57v3.3h4.57v86.42l-41.6,4.88v13.59l-50.03,6.51v7.85l-39.86,3.82v32.41h39.43v3.82h50.46v5.51h78.77v-6.36h39.01v-88.78l-40.7,5.94ZM192.63,91.86l-28.87-15.68v-10.84l28.87,15.54v10.98ZM163.76,79.93l28.87,15.68v41.54l-28.87,3.39v-60.61ZM71.7,210.5v-10.38h-3.3v6.56h-36.13v-18.72l21.82-2.06.11-3.34-21.93,2.07v-4.07l36.55-3.5v8.04h3.3v-8.48l21.4-2.59.11-3.34-21.52,2.61v-4.5l46.73-6.08v47.77h-47.15ZM197.63,209.65v-11.23h-3.3v17.59h-72.17v-67.26l70.47-8.26v3.45l-25.25,3.14-.08,3.09,25.33-2.89v21.77h3.3v-35.6l37.4-5.45v3.76l-24.4,3.8v3.34s24.4-3.8,24.4-3.8v74.56h-35.71Z" /> <path
d="M195.93,130.1v-51.2l-32.17-17.31v-7.08h4.6v-3.3h-4.6v-4.54h-3.3v4.54h-4.57v3.3h4.57v86.42l-41.6,4.88v13.59l-50.03,6.51v7.85l-39.86,3.82v32.41h39.43v3.82h50.46v5.51h78.77v-6.36h39.01v-88.78l-40.7,5.94ZM192.63,91.86l-28.87-15.68v-10.84l28.87,15.54v10.98ZM163.76,79.93l28.87,15.68v41.54l-28.87,3.39v-60.61ZM71.7,210.5v-10.38h-3.3v6.56h-36.13v-18.72l21.82-2.06.11-3.34-21.93,2.07v-4.07l36.55-3.5v8.04h3.3v-8.48l21.4-2.59.11-3.34-21.52,2.61v-4.5l46.73-6.08v47.77h-47.15ZM197.63,209.65v-11.23h-3.3v17.59h-72.17v-67.26l70.47-8.26v3.45l-25.25,3.14-.08,3.09,25.33-2.89v21.77h3.3v-35.6l37.4-5.45v3.76l-24.4,3.8v3.34s24.4-3.8,24.4-3.8v74.56h-35.71Z" />
</g>
</svg>
)
@ -97,7 +98,8 @@ export const ChurchIcon = ({church, style, stroke, color}: ChurchIconProps) => {
strokeWidth={stroke}
strokeMiterlimit={10}
>
<path d="M102.08,85.55v-28.46L62.97,23.42v-10.6h3.68v-3h-3.68v-4.21h-3v4.21h-3.68v3h3.68v75.51l-24.92,23.7v95.05l39.92,6.65v.23h116.51V58.26l-89.4,27.29ZM38.05,204.54v-91.22l24.92-23.7V27.38l36.11,31.09v28l-8,2.44v-19.78c0-4.63-3.62-8.39-8.06-8.39s-8.06,3.77-8.06,8.39v141.57l-36.92-6.15ZM88.08,89.82l-10.12,3.09v-23.79c0-2.97,2.27-5.39,5.06-5.39s5.06,2.42,5.06,5.39v20.7ZM77.96,174.36h6.36v6.86h60.31v29.73h-66.67v-36.6ZM188.48,210.96h-40.85v-32.73h-60.31v-6.86h-9.36v-75.31l110.51-33.74v148.65Z" /> <path
d="M102.08,85.55v-28.46L62.97,23.42v-10.6h3.68v-3h-3.68v-4.21h-3v4.21h-3.68v3h3.68v75.51l-24.92,23.7v95.05l39.92,6.65v.23h116.51V58.26l-89.4,27.29ZM38.05,204.54v-91.22l24.92-23.7V27.38l36.11,31.09v28l-8,2.44v-19.78c0-4.63-3.62-8.39-8.06-8.39s-8.06,3.77-8.06,8.39v141.57l-36.92-6.15ZM88.08,89.82l-10.12,3.09v-23.79c0-2.97,2.27-5.39,5.06-5.39s5.06,2.42,5.06,5.39v20.7ZM77.96,174.36h6.36v6.86h60.31v29.73h-66.67v-36.6ZM188.48,210.96h-40.85v-32.73h-60.31v-6.86h-9.36v-75.31l110.51-33.74v148.65Z" />
<path d="M60.08,109.28c-4.44,0-8.06,3.77-8.06,8.39v46.88h16.12v-46.88c0-4.63-3.62-8.39-8.06-8.39ZM65.14,161.55h-10.12v-43.88c0-2.97,2.27-5.39,5.06-5.39s5.06,2.42,5.06,5.39v43.88Z" /> <path d="M60.08,109.28c-4.44,0-8.06,3.77-8.06,8.39v46.88h16.12v-46.88c0-4.63-3.62-8.39-8.06-8.39ZM65.14,161.55h-10.12v-43.88c0-2.97,2.27-5.39,5.06-5.39s5.06,2.42,5.06,5.39v43.88Z" />
<path d="M102.08,108.25c0-5.42-3.62-9.83-8.06-9.83s-8.06,4.41-8.06,9.83v56.31h16.12v-56.31ZM99.08,161.55h-10.12v-53.31c0-3.76,2.27-6.83,5.06-6.83s5.06,3.06,5.06,6.83v53.31Z" /> <path d="M102.08,108.25c0-5.42-3.62-9.83-8.06-9.83s-8.06,4.41-8.06,9.83v56.31h16.12v-56.31ZM99.08,161.55h-10.12v-53.31c0-3.76,2.27-6.83,5.06-6.83s5.06,3.06,5.06,6.83v53.31Z" />
<path d="M121.26,108.25c0-5.42-3.62-9.83-8.06-9.83s-8.06,4.41-8.06,9.83v56.31h16.12v-56.31ZM118.26,161.55h-10.12v-53.31c0-3.76,2.27-6.83,5.06-6.83s5.06,3.06,5.06,6.83v53.31Z" /> <path d="M121.26,108.25c0-5.42-3.62-9.83-8.06-9.83s-8.06,4.41-8.06,9.83v56.31h16.12v-56.31ZM118.26,161.55h-10.12v-53.31c0-3.76,2.27-6.83,5.06-6.83s5.06,3.06,5.06,6.83v53.31Z" />
@ -140,19 +142,21 @@ export const ChurchIcon = ({church, style, stroke, color}: ChurchIconProps) => {
) )
} }
if (church === 'antoniusFalkenberg') { if (church === 'antoniusFrankenberg') {
return ( return (
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 240.94 240.94"> <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 399.55 240.94">
<g <g
fill={style === 'outline' ? '#ffffff' : color} fill={style === 'outline' ? '#ffffff' : color}
stroke={color} stroke={color}
strokeWidth={stroke} strokeWidth={stroke}
strokeMiterlimit={10} strokeMiterlimit={10}
> >
<path d="M139.66,58.07c-4.06,0-7.36,3.3-7.36,7.36v31.72h14.72v-31.72c0-4.06-3.3-7.36-7.36-7.36ZM144.82,94.96h-10.33v-29.52c0-2.85,2.32-5.16,5.16-5.16s5.16,2.32,5.16,5.16v29.52Z" /> <path d="M191.57,76.17c0-6.45-4.23-11.7-9.42-11.7s-9.42,5.25-9.42,11.7v41.6h18.84v-41.6ZM188.57,114.77h-12.84v-38.6c0-4.8,2.88-8.7,6.42-8.7s6.42,3.9,6.42,8.7v38.6Z" />
<path d="M78.21,170.99h-25.57v17.85h25.57v-17.85ZM76.02,186.64h-21.18v-13.45h21.18v13.45Z" /> <path d="M253.05,110.23l-21.46-1.17v59.41l21.46-2.98v-55.27ZM250.05,162.88l-15.46,2.15v-52.8l15.46.84v49.81Z" />
<path d="M128.52,170.99h-25.57v17.85h25.57v-17.85ZM126.32,186.64h-21.18v-13.45h21.18v13.45Z" /> <path d="M283.82,111.4l-17.46-1.06v52.58l17.46-2.72v-48.8ZM280.82,157.64l-11.46,1.78v-45.89l11.46.7v43.41Z" />
<path d="M157.25,105.18v-59.28h-5.43v-16.57h4.33v-2.2h-4.33v-7.27h-2.2v7.27h-4.35v2.2h4.35v16.57h-3.83l.13,4.67h-22.96v25.79l-63.87-.63v23.61h-21.4v110.75h119.56v-.66l29-30.48v-33.29l-29-40.48ZM155.05,207.89H39.88v-43.97h115.17v43.97ZM155.05,161.73H39.88v-60.2h21.4v-23.59l63.87.63v-25.81h23.02l-.13-4.67h7.01v113.64ZM184.06,178.07l-26.81,28.18v-97.3l26.81,37.42v31.71Z" /> <path d="M309.01,113.18l-15.17-.98v47.38l15.17-2.51v-43.89ZM306.01,154.52l-9.17,1.51v-40.64l9.17.59v38.54Z" />
<path d="M322.04,154.82l12.59-2.35v-39.37l-12.59-.92v42.64ZM325.04,115.41l6.59.48v34.08l-6.59,1.23v-35.8Z" />
<path d="M374.14,126.83v-3.55h-14.93v-15.26l-133.16-9.69v-47.6l-17.39-3.95v-22.63h6.72v-3h-6.72v-7.4h-3v7.4h-6.73v3h6.73v22.45l-43.01,3.66v44.67l-109.35-3.66v30.25H13.5v76.95l195.16,23.7v-.65l17.39-6.38v-9.08l119.01-32.96v-.03s29.08,0,29.08,0v-6.97h-5.44v-3.41h-6.59v-5h-2.91v-30.86h14.93ZM83.46,203.93l-13.65-1.66v-15.11l13.65,1.17v15.6ZM121.97,208.6l-35.5-4.31v-18.72l-19.65-1.68v18.02l-50.32-6.11v-25.09l105.47,7.71v30.19ZM121.97,175.4l-105.47-7.71v-43.19h39.8v-30.15l65.66,2.2v78.85ZM156.8,212.83l-15.66-1.9v-17.48l15.66,1.2v18.18ZM197.91,217.83l-38.1-4.63v-21.33l-21.66-1.66v20.35l-13.17-1.6v-30.34l72.94,5.33v33.87ZM197.91,180.95l-72.94-5.33v-78.96l40.69,1.36v-45.01l32.25-2.75v130.69ZM205.66,218.77l-4.76-.58V50.01l4.76-.4v169.16ZM223.05,213.03l-14.39,5.28V49.85l14.39,3.27v159.91ZM314.03,178.57l-87.98,24.36v-23.18l87.98-15.9v14.71ZM314.03,160.81l-87.98,15.9v-75.37l87.98,6.4v53.07ZM343.46,170.42l-26.43,7.32v-14.43l26.43-4.78v11.88ZM343.46,123.28v32.21l-26.43,4.78v-52.31l39.18,2.85v12.47h-12.75ZM356.21,126.83v30.84l-9.75-.03v-30.81h9.75ZM360.11,159.84v4.84h6.59v3.41h5.44v1.97h-25.69v-10.22h13.66Z" />
</g> </g>
</svg> </svg>
) )
@ -167,9 +171,12 @@ export const ChurchIcon = ({church, style, stroke, color}: ChurchIconProps) => {
strokeWidth={stroke} strokeWidth={stroke}
strokeMiterlimit={10} strokeMiterlimit={10}
> >
<polygon points="126.83 186.79 128.83 186.79 128.83 178.02 131.13 178.02 131.13 176.02 128.83 176.02 128.83 173.63 126.83 173.63 126.83 176.02 124.53 176.02 124.53 178.02 126.83 178.02 126.83 186.79" /> <polygon
<path d="M269.06,100.09c-3.4,0-6.16,2.76-6.16,6.16v60.54h12.33v-60.54c0-3.4-2.77-6.16-6.16-6.16ZM273.23,164.8h-8.33v-58.54c0-2.3,1.87-4.16,4.16-4.16s4.16,1.87,4.16,4.16v58.54Z" /> points="126.83 186.79 128.83 186.79 128.83 178.02 131.13 178.02 131.13 176.02 128.83 176.02 128.83 173.63 126.83 173.63 126.83 176.02 124.53 176.02 124.53 178.02 126.83 178.02 126.83 186.79" />
<path d="M292.87,221.86c-.23-8.98-2.01-79.99-1.59-103.42.15-8.37-2.06-14.68-6.58-18.77-5.25-4.75-11.99-5.05-14.64-4.96v-7.68c2.13-.41,3.8-2.08,4.21-4.21h2.02v-2h-2.02c-.41-2.13-2.08-3.81-4.21-4.21v-11.57h-2v11.57c-2.13.41-3.81,2.08-4.21,4.21h-2.02v2h2.02c.41,2.13,2.09,3.81,4.21,4.21v7.79c-6.03.44-10.95,4.82-10.95,9.96l-.56,51.84v8.23l-121.62-15.61-49.35,41.76-16.01,1.19v4.85l4.65.16v3.41l-16.19,13.13v70.19l31.69,1.2v.07l11.38.36,2.57.1h0s12.4.47,12.4.47h0s13.77.52,13.77.52l.88.03h0s12.59.48,12.59.48l1.21.05h0s35.05,1.33,35.05,1.33l1.04.04v-.1l34.9-2.33h0s9.11-.61,9.11-.61l13.04-.87h0s10.82-.72,10.82-.72h0s11.17-.74,11.17-.74h0s10.82-.72,10.82-.72h0s11.6-.77,11.6-.77h0s10.82-.72,10.82-.72v-59.13h-.02ZM289.28,118.41c-.42,23.36,1.35,93.87,1.59,103.38l-8.79-.31h0s0,0,0,0l-1.58-116.62c0-2.82-1.32-5.36-3.44-7.2,2.11.65,4.35,1.72,6.32,3.51,4.06,3.68,6.05,9.48,5.91,17.24ZM284.08,279.58v-56.02l6.82.24v55.33l-6.82.45ZM270.06,84.97v-2.14h2.14c-.33,1.02-1.12,1.82-2.14,2.14ZM272.2,80.83h-2.14v-2.14c1.02.33,1.82,1.12,2.14,2.14ZM268.06,78.68v2.14h-2.14c.33-1.02,1.12-1.82,2.14-2.14ZM265.92,82.83h2.14v2.14c-1.02-.33-1.82-1.12-2.14-2.14ZM258.55,156.64l.56-51.84c0-4.07,4.01-7.55,8.95-7.97l1.04-.04c5.18.02,9.39,3.64,9.39,8.09l1.55,115.17-21.5-20.35v-43.06ZM256.55,166.87v33.69l21.98,20.81-8.06-.28h0s-8.1-.28-8.1-.28l-36.03-1.25v-.03l-10.82-.36v.02l-35.34-1.23-43.43-65.96,119.81,14.88ZM264.2,280.9l-2.54.17v-58.29l.76.03,6.06.21v57.6l-4.28.29ZM246.49,282.08l-2.32.15-4.5.3v-60.5l6.82.23v59.82ZM221.52,283.75l-4,.27v-62.77l6.82.24v62.07l-2.82.19ZM201.62,285.07l-6.82.45v-64.99l6.82.23v64.31ZM134.5,152.23l43.23,65.64-30.06-1.04,12.92-19.46v-4.42l-34.07-5.27-37.52,3.05,45.5-38.5ZM157.59,197.75l-10.71,16.62v-15.26l10.71-1.36ZM157.94,196.13l-13.24,1.41-72.82-2.39v-1.53l54.55-3.93,31.5,4.67v1.77ZM135.72,199.26l6.82.23v85.65l-6.82-.27v-85.61ZM76.23,282.56v-85.3l6.82.23v85.33l-6.82-.26ZM74.23,203.23v11.07l-12.57-.43,12.57-10.63ZM60.04,215.81l14.19.49v66.25l-14.19-.54v-66.2ZM100.13,283.53l-8.4-.32v-1
9.62l8.4.27v19.67ZM114.08,284.06l-3.67-.14-4.73-.18v-19.77l8.4.22v19.86ZM120.33,284.28v-20.06l8.4.2v20.19l-8.4-.33ZM118.33,262.17v22.04l-2.25-.09v-21.89l-12.4-.33v21.75l-1.55-.06v-21.68l-12.4-.39v21.6l-1.8-.07v-85.41l45.78,1.54v85.61l-2.98-.11v-22.22l-12.4-.3ZM178.63,286.5l-31.75-1.2v-66.5l31.75,1.1v66.61ZM180.63,219.97l12.17.42v65.27l-12.17.81v-66.51ZM203.62,284.94v-64.18l11.9.41v62.97l-11.9.79ZM226.34,283.43v-61.87l11.33.39v60.72l-11.33.76ZM248.49,281.95v-59.63l11.17.39v58.49l-11.17.74ZM270.48,280.48v-57.4l10.25.36,1.35,1.28v54.99l-11.6.77Z" /> <path
d="M269.06,100.09c-3.4,0-6.16,2.76-6.16,6.16v60.54h12.33v-60.54c0-3.4-2.77-6.16-6.16-6.16ZM273.23,164.8h-8.33v-58.54c0-2.3,1.87-4.16,4.16-4.16s4.16,1.87,4.16,4.16v58.54Z" />
<path
d="M292.87,221.86c-.23-8.98-2.01-79.99-1.59-103.42.15-8.37-2.06-14.68-6.58-18.77-5.25-4.75-11.99-5.05-14.64-4.96v-7.68c2.13-.41,3.8-2.08,4.21-4.21h2.02v-2h-2.02c-.41-2.13-2.08-3.81-4.21-4.21v-11.57h-2v11.57c-2.13.41-3.81,2.08-4.21,4.21h-2.02v2h2.02c.41,2.13,2.09,3.81,4.21,4.21v7.79c-6.03.44-10.95,4.82-10.95,9.96l-.56,51.84v8.23l-121.62-15.61-49.35,41.76-16.01,1.19v4.85l4.65.16v3.41l-16.19,13.13v70.19l31.69,1.2v.07l11.38.36,2.57.1h0s12.4.47,12.4.47h0s13.77.52,13.77.52l.88.03h0s12.59.48,12.59.48l1.21.05h0s35.05,1.33,35.05,1.33l1.04.04v-.1l34.9-2.33h0s9.11-.61,9.11-.61l13.04-.87h0s10.82-.72,10.82-.72h0s11.17-.74,11.17-.74h0s10.82-.72,10.82-.72h0s11.6-.77,11.6-.77h0s10.82-.72,10.82-.72v-59.13h-.02ZM289.28,118.41c-.42,23.36,1.35,93.87,1.59,103.38l-8.79-.31h0s0,0,0,0l-1.58-116.62c0-2.82-1.32-5.36-3.44-7.2,2.11.65,4.35,1.72,6.32,3.51,4.06,3.68,6.05,9.48,5.91,17.24ZM284.08,279.58v-56.02l6.82.24v55.33l-6.82.45ZM270.06,84.97v-2.14h2.14c-.33,1.02-1.12,1.82-2.14,2.14ZM272.2,80.83h-2.14v-2.14c1.02.33,1.82,1.12,2.14,2.14ZM268.06,78.68v2.14h-2.14c.33-1.02,1.12-1.82,2.14-2.14ZM265.92,82.83h2.14v2.14c-1.02-.33-1.82-1.12-2.14-2.14ZM258.55,156.64l.56-51.84c0-4.07,4.01-7.55,8.95-7.97l1.04-.04c5.18.02,9.39,3.64,9.39,8.09l1.55,115.17-21.5-20.35v-43.06ZM256.55,166.87v33.69l21.98,20.81-8.06-.28h0s-8.1-.28-8.1-.28l-36.03-1.25v-.03l-10.82-.36v.02l-35.34-1.23-43.43-65.96,119.81,14.88ZM264.2,280.9l-2.54.17v-58.29l.76.03,6.06.21v57.6l-4.28.29ZM246.49,282.08l-2.32.15-4.5.3v-60.5l6.82.23v59.82ZM221.52,283.75l-4,.27v-62.77l6.82.24v62.07l-2.82.19ZM201.62,285.07l-6.82.45v-64.99l6.82.23v64.31ZM134.5,152.23l43.23,65.64-30.06-1.04,12.92-19.46v-4.42l-34.07-5.27-37.52,3.05,45.5-38.5ZM157.59,197.75l-10.71,16.62v-15.26l10.71-1.36ZM157.94,196.13l-13.24,1.41-72.82-2.39v-1.53l54.55-3.93,31.5,4.67v1.77ZM135.72,199.26l6.82.23v85.65l-6.82-.27v-85.61ZM76.23,282.56v-85.3l6.82.23v85.33l-6.82-.26ZM74.23,203.23v11.07l-12.57-.43,12.57-10.63ZM60.04,215.81l14.19.49v66.25l-14.19-.54v-66.2ZM100.13,283.53l-8.4-.32v-19.62l8
.4.27v19.67ZM114.08,284.06l-3.67-.14-4.73-.18v-19.77l8.4.22v19.86ZM120.33,284.28v-20.06l8.4.2v20.19l-8.4-.33ZM118.33,262.17v22.04l-2.25-.09v-21.89l-12.4-.33v21.75l-1.55-.06v-21.68l-12.4-.39v21.6l-1.8-.07v-85.41l45.78,1.54v85.61l-2.98-.11v-22.22l-12.4-.3ZM178.63,286.5l-31.75-1.2v-66.5l31.75,1.1v66.61ZM180.63,219.97l12.17.42v65.27l-12.17.81v-66.51ZM203.62,284.94v-64.18l11.9.41v62.97l-11.9.79ZM226.34,283.43v-61.87l11.33.39v60.72l-11.33.76ZM248.49,281.95v-59.63l11.17.39v58.49l-11.17.74ZM270.48,280.48v-57.4l10.25.36,1.35,1.28v54.99l-11.6.77Z" />
</g> </g>
</svg> </svg>
) )


@ -1 +1 @@
<?xml version="1.0" encoding="UTF-8"?><svg id="Ebene_4" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 240.94 240.94"><defs><style>.cls-1{fill:none;stroke:#181715;stroke-miterlimit:10;stroke-width:.25px;}</style></defs><path class="cls-1" d="M139.66,58.07c-4.06,0-7.36,3.3-7.36,7.36v31.72h14.72v-31.72c0-4.06-3.3-7.36-7.36-7.36ZM144.82,94.96h-10.33v-29.52c0-2.85,2.32-5.16,5.16-5.16s5.16,2.32,5.16,5.16v29.52Z"/><path class="cls-1" d="M78.21,170.99h-25.57v17.85h25.57v-17.85ZM76.02,186.64h-21.18v-13.45h21.18v13.45Z"/><path class="cls-1" d="M128.52,170.99h-25.57v17.85h25.57v-17.85ZM126.32,186.64h-21.18v-13.45h21.18v13.45Z"/><path class="cls-1" d="M157.25,105.18v-59.28h-5.43v-16.57h4.33v-2.2h-4.33v-7.27h-2.2v7.27h-4.35v2.2h4.35v16.57h-3.83l.13,4.67h-22.96v25.79l-63.87-.63v23.61h-21.4v110.75h119.56v-.66l29-30.48v-33.29l-29-40.48ZM155.05,207.89H39.88v-43.97h115.17v43.97ZM155.05,161.73H39.88v-60.2h21.4v-23.59l63.87.63v-25.81h23.02l-.13-4.67h7.01v113.64ZM184.06,178.07l-26.81,28.18v-97.3l26.81,37.42v31.71Z"/></svg> <?xml version="1.0" encoding="UTF-8"?><svg id="Ebene_4" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 399.55 240.94"><defs><style>.cls-1{fill:none;stroke:#181715;stroke-miterlimit:10;stroke-width:.25px;}</style></defs><path class="cls-1" d="M191.57,76.17c0-6.45-4.23-11.7-9.42-11.7s-9.42,5.25-9.42,11.7v41.6h18.84v-41.6ZM188.57,114.77h-12.84v-38.6c0-4.8,2.88-8.7,6.42-8.7s6.42,3.9,6.42,8.7v38.6Z"/><path class="cls-1" d="M253.05,110.23l-21.46-1.17v59.41l21.46-2.98v-55.27ZM250.05,162.88l-15.46,2.15v-52.8l15.46.84v49.81Z"/><path class="cls-1" d="M283.82,111.4l-17.46-1.06v52.58l17.46-2.72v-48.8ZM280.82,157.64l-11.46,1.78v-45.89l11.46.7v43.41Z"/><path class="cls-1" d="M309.01,113.18l-15.17-.98v47.38l15.17-2.51v-43.89ZM306.01,154.52l-9.17,1.51v-40.64l9.17.59v38.54Z"/><path class="cls-1" d="M322.04,154.82l12.59-2.35v-39.37l-12.59-.92v42.64ZM325.04,115.41l6.59.48v34.08l-6.59,1.23v-35.8Z"/><path class="cls-1" 
d="M374.14,126.83v-3.55h-14.93v-15.26l-133.16-9.69v-47.6l-17.39-3.95v-22.63h6.72v-3h-6.72v-7.4h-3v7.4h-6.73v3h6.73v22.45l-43.01,3.66v44.67l-109.35-3.66v30.25H13.5v76.95l195.16,23.7v-.65l17.39-6.38v-9.08l119.01-32.96v-.03s29.08,0,29.08,0v-6.97h-5.44v-3.41h-6.59v-5h-2.91v-30.86h14.93ZM83.46,203.93l-13.65-1.66v-15.11l13.65,1.17v15.6ZM121.97,208.6l-35.5-4.31v-18.72l-19.65-1.68v18.02l-50.32-6.11v-25.09l105.47,7.71v30.19ZM121.97,175.4l-105.47-7.71v-43.19h39.8v-30.15l65.66,2.2v78.85ZM156.8,212.83l-15.66-1.9v-17.48l15.66,1.2v18.18ZM197.91,217.83l-38.1-4.63v-21.33l-21.66-1.66v20.35l-13.17-1.6v-30.34l72.94,5.33v33.87ZM197.91,180.95l-72.94-5.33v-78.96l40.69,1.36v-45.01l32.25-2.75v130.69ZM205.66,218.77l-4.76-.58V50.01l4.76-.4v169.16ZM223.05,213.03l-14.39,5.28V49.85l14.39,3.27v159.91ZM314.03,178.57l-87.98,24.36v-23.18l87.98-15.9v14.71ZM314.03,160.81l-87.98,15.9v-75.37l87.98,6.4v53.07ZM343.46,170.42l-26.43,7.32v-14.43l26.43-4.78v11.88ZM343.46,123.28v32.21l-26.43,4.78v-52.31l39.18,2.85v12.47h-12.75ZM356.21,126.83v30.84l-9.75-.03v-30.81h9.75ZM360.11,159.84v4.84h6.59v3.41h5.44v1.97h-25.69v-10.22h13.66Z"/></svg>



@ -1 +1 @@
<?xml version="1.0" encoding="UTF-8"?><svg id="Ebene_4" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 243.78 243.78"><defs><style>.cls-1{fill:none;stroke:#181715;stroke-miterlimit:10;stroke-width:.25px;}</style></defs><path class="cls-1" d="M146.38,160.38c0,6.94,3.11,12.38,7.09,12.38s7.09-5.44,7.09-12.38-3.11-12.38-7.09-12.38-7.09,5.44-7.09,12.38ZM153.47,150.48c2.18,0,4.61,4.07,4.61,9.9s-2.43,9.9-4.61,9.9-4.61-4.07-4.61-9.9,2.43-9.9,4.61-9.9Z"/><path class="cls-1" d="M175.61,75.38l-19.75-44.65v-9.17h2.99v-2.48h-2.99v-4.92h-2.48v4.92h-2.99v2.48h2.99v9.2l-16.95,43.7v45.87l-77.66,13.62v91.58h110.15v-52.78h6.69v-97.38ZM138.92,76.13l19.17-7.81v22.98l-19.17,6.79v-21.96ZM160.57,68.74l12.57,8.01v24.6l-12.57-9.77v-22.84ZM160.16,65.54l-9.21-21.47,3.75-9.8,16.9,38.57-11.44-7.29ZM149.69,47.37l7.99,18.43-17.83,7.27,9.83-25.7ZM61.25,136.04l88.3-15.49-4.73,5.68-7.82,9.38-75.76,14.12v-13.69ZM136.43,223.06H61.25v-70.8l55.46-10.34,19.72-3.68v84.81ZM166.44,223.06h-27.53v-85.86l8.11-9.73,5.41-6.49,14.01,27.36v74.71ZM168.92,170.28v-22.53l-14.77-28.84-.79-1.54-1.35.24-13.1,2.3v-19.17l20.18-7.16,14.03,10.91v65.8h-4.21Z"/></svg> <?xml version="1.0" encoding="UTF-8"?><svg id="Ebene_4" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 238.46 281.04"><defs><style>.cls-1{fill:none;stroke:#181715;stroke-miterlimit:10;stroke-width:.25px;}</style></defs><path class="cls-1" d="M133.99,166.6c7.44,0,13.5-6.52,13.5-14.53s-6.05-14.53-13.5-14.53-13.5,6.52-13.5,14.53,6.05,14.53,13.5,14.53ZM133.99,140.53c5.79,0,10.5,5.17,10.5,11.53s-4.71,11.53-10.5,11.53-10.5-5.17-10.5-11.53,4.71-11.53,10.5-11.53Z"/><path class="cls-1" 
d="M174.61,59.65h0l-11.73-28.56v-5.84h2.84v-3h-2.84v-4.04h-3v4.04h-2.84v3h2.84v5.87l-9.56,26.23-8.2,26.37-.53,34.64-4.77-6.79v-5.84h2.34v-3h-2.34v-2.78h-3v2.78h-2.34v3h2.34v5.45l-29.91,17.18v6.51l-54.73,30.91,1.48,2.61,4.88-2.75,9,19.54v56.39l34.19,5.29v-57.46l-14.46-22.17-23.8-4.37,43.45-24.54v111.98l-51.66-8.65-.5,2.96,53.38,8.93.21.04,79.45-9.58V83.36l-10.18-23.71ZM95.7,243.31l-28.19-4.27v-52.4l28.19,4.5v52.18ZM94.06,187.79l-27.01-4.29-8.26-17.93,23.63,4.34,11.64,17.88ZM167.74,238.88l-22.69,2.43v-30.23c0-8.53,5.09-15.47,11.35-15.47s11.35,6.94,11.35,15.47v27.8ZM156.39,192.6c-7.91,0-14.35,8.29-14.35,18.47v30.56l-1.09.12v-50.34l18.42-20.5,3.08,5.01,10.15,16.81v45.62l-1.87.2v-27.48c0-10.19-6.44-18.47-14.35-18.47ZM116,193.16l21.96-.77v49.7l-21.96,2.55v-51.47ZM138.76,189.36l-20.06.71,13.79-11.9,17.9-1.75-11.63,12.94ZM181.79,104.01l-17.86-11.23-.4-18.62,18.27,10.91v18.94ZM162.99,39.27l8.46,20.6-8.04,8.04-.42-28.64ZM160.93,92.86l-16.09,8.66.25-16.23,15.44-10.95.4,18.53ZM144.79,104.96l16.21-8.73.86,40.05-6.29,2.72-11.06-16.43.27-17.6ZM161.93,139.51l.2,9.22-4.85-7.21,4.65-2.01ZM164,96.37l17.79,11.18v39.67l-16.92-10.86-.86-39.99ZM164.39,71.18l8.31-8.31,7.44,17.72-15.75-9.41ZM159.99,39.66l.41,28.07-7.67-7.88,7.26-20.19ZM151.7,63.09l7.97,8.18-13.35,9.47,5.38-17.65ZM106.89,250.37v-120.27l27.97-16.06,6.9,9.82,20.47,30.41v15.57l-2.42-3.92-6.46,7.19-22.11,2.16-18.26,15.98.04,56.11,35.17-3.91,27.44-3.05-.04-48.51-10.37-17.18v-21.36h-.01s-.29-13.38-.29-13.38l16.85,10.81v90.55l-74.89,9.03Z"/></svg>



@ -1,10 +1,9 @@
"use client" "use client"
import classNames from 'classnames'
import styles from '@/components/Classifieds/styles.module.scss' import styles from '@/components/Classifieds/styles.module.scss'
import { useState } from 'react' import { useState } from 'react'
import { SerializedEditorState } from 'lexical' import { SerializedEditorState } from 'lexical'
import { RichText } from '@payloadcms/richtext-lexical/react' import { RichText } from '@/components/Text/RichText'
type AdProps = { type AdProps = {
text: SerializedEditorState, text: SerializedEditorState,


@ -4,17 +4,17 @@ import Link from 'next/link'
export type ImageCardProps = { export type ImageCardProps = {
src: string, src: string,
title: string, title: string,
href: string href?: string
} }
export const ImageCard = ({src, title, href}: ImageCardProps) => { export const ImageCard = ({src, title, href}: ImageCardProps) => {
return ( const card = (
<Link href={href}> <div className={styles.container} style={{ backgroundImage: `url(${encodeURI(src)})` }}>
<div className={styles.container} style={{ backgroundImage: `url(${encodeURI(src)})` }}> <div className={styles.title}>
<div className={styles.title}> {title}
{title}
</div>
</div> </div>
</Link> </div>
) )
}
return href ? <Link href={href}>{card}</Link> : card
}
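The refactor above makes `href` optional and wraps the card in a `Link` only when one is set. Reduced to plain functions (no React), the shape is roughly this; `renderCard` and `wrapInLink` are illustrative stand-ins, not repo code:

```typescript
// Illustrative stand-ins for the JSX in ImageCard; not actual repo code.
const renderCard = (title: string): string => `<card>${title}</card>`
const wrapInLink = (href: string, inner: string): string =>
  `<a href="${href}">${inner}</a>`

// Mirrors the component logic: build the card once, wrap it only if an
// href exists, otherwise return the bare card.
export const imageCard = (title: string, href?: string): string => {
  const card = renderCard(title)
  return href ? wrapInLink(href, card) : card
}
```

This keeps a single definition of the card markup instead of duplicating the JSX in a linked and an unlinked branch.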


@ -1,7 +1,7 @@
import styles from "./html.module.scss" import styles from "./html.module.scss"
import classNames from 'classnames' import classNames from 'classnames'
import { RichText } from '@payloadcms/richtext-lexical/react'
import { SerializedEditorState, SerializedLexicalNode } from 'lexical' import { SerializedEditorState, SerializedLexicalNode } from 'lexical'
import { RichText } from './RichText'
type HTMLTextProps = { type HTMLTextProps = {
width: "1/2" | "3/4", width: "1/2" | "3/4",


@ -0,0 +1,16 @@
import { RichText as PayloadRichText } from '@payloadcms/richtext-lexical/react'
import { SerializedEditorState } from 'lexical'
import { jsxConverters } from './converters'
type RichTextProps = {
data: SerializedEditorState
}
// Thin wrapper around Payload's RichText that always wires up our custom JSX
// converters (e.g. rendering links marked as "Button" via the Button component).
// Use this everywhere instead of importing RichText directly from Payload, so
// the converters can never be forgotten.
export const RichText = ({ data }: RichTextProps) => (
<PayloadRichText data={data} converters={jsxConverters} />
)


@ -0,0 +1,50 @@
import type { JSXConvertersFunction } from '@payloadcms/richtext-lexical/react'
import type { SerializedLinkNode } from '@payloadcms/richtext-lexical'
import { Button } from '@/components/Button/Button'
// Lexical link nodes carry their editable metadata (url, newTab, linkType, ...)
// in `fields`. We extend that shape with the custom `appearance` select that is
// added to LinkFeature in `payload.config.ts`, so editors can mark a link as a
// call-to-action button.
type LinkFields = SerializedLinkNode['fields'] & {
appearance?: 'link' | 'button'
}
// Custom JSX converters passed to <RichText /> wherever rich text is rendered.
// Only the `link` converter is overridden — every other node keeps Payload's
// default rendering via the spread of `defaultConverters`.
export const jsxConverters: JSXConvertersFunction = ({ defaultConverters }) => ({
...defaultConverters,
link: (args) => {
const { node, nodesToJSX } = args
const fields = node.fields as LinkFields
// Normal link → delegate to Payload's built-in link converter.
// The default converter from the package is a function, but the type
// also allows a static ReactNode, so we narrow before calling.
if (fields?.appearance !== 'button') {
const defaultLink = defaultConverters.link
return typeof defaultLink === 'function' ? defaultLink(args) : defaultLink
}
// Button link → resolve href the same way Payload's default converter does:
// internal links point at the related doc's slug, custom links use `url`.
const href =
fields.linkType === 'internal' && typeof fields.doc?.value === 'object'
? `/${(fields.doc.value as { slug?: string }).slug ?? ''}`
: (fields.url ?? '#')
// Render the existing Button component. Schema and size are intentionally
// hardcoded — editors only choose link vs. button, not the styling.
return (
<Button
href={href}
size="md"
schema="contrast"
target={fields.newTab ? '_blank' : '_self'}
>
{nodesToJSX({ nodes: node.children })}
</Button>
)
},
})
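The internal-vs-custom href resolution used in the button branch above can be sketched standalone like this; `LinkFieldsLike` is a simplified stand-in for the Lexical link fields type, assumed here rather than imported:

```typescript
// Simplified stand-in for SerializedLinkNode['fields']; an assumption for
// illustration, not the real Payload type.
type LinkFieldsLike = {
  linkType?: 'internal' | 'custom'
  url?: string | null
  // For internal links Payload stores the related doc; populated docs are objects.
  doc?: { value: string | { slug?: string } } | null
}

// Same resolution rule as the converter: internal links point at the related
// doc's slug, everything else falls back to `url`, then '#'.
export const resolveButtonHref = (fields: LinkFieldsLike): string =>
  fields.linkType === 'internal' && typeof fields.doc?.value === 'object'
    ? `/${fields.doc.value.slug ?? ''}`
    : (fields.url ?? '#')
```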


@ -7,7 +7,7 @@ type TitleProps = {
align?: 'left' | 'center'; align?: 'left' | 'center';
size?: 'xl' | 'lg' | 'md' | "sm"; size?: 'xl' | 'lg' | 'md' | "sm";
fontStyle?: 'serif' | 'sans-serif' fontStyle?: 'serif' | 'sans-serif'
color?: "base" | "contrast" | "white", color?: "base" | "shade1" | "shade2" | "shade3" | "contrast" | "contrastShade1" | "white",
cancelled?: boolean, cancelled?: boolean,
} }
@ -17,7 +17,11 @@ export const Title = ({title, subtitle, align = "left", size = "lg", fontStyle =
<h2 className={classNames({ <h2 className={classNames({
[styles.title]: true, [styles.title]: true,
[styles.base]: color === "base", [styles.base]: color === "base",
[styles.shade1]: color === "shade1",
[styles.shade2]: color === "shade2",
[styles.shade3]: color === "shade3",
[styles.contrast]: color === "contrast", [styles.contrast]: color === "contrast",
[styles.contrastShade1]: color === "contrastShade1",
[styles.white]: color === "white", [styles.white]: color === "white",
[styles.extraLarge]: size === "xl", [styles.extraLarge]: size === "xl",
[styles.large]: size === "lg", [styles.large]: size === "lg",
@ -33,6 +37,10 @@ export const Title = ({title, subtitle, align = "left", size = "lg", fontStyle =
[styles.subtitle]: true, [styles.subtitle]: true,
[styles.base]: color === "contrast", [styles.base]: color === "contrast",
[styles.contrast]: color === "base", [styles.contrast]: color === "base",
[styles.shade1]: color === "shade1",
[styles.shade2]: color === "shade2",
[styles.shade3]: color === "shade3",
[styles.contrastShade1]: color === "contrastShade1",
[styles.white]: color === "white", [styles.white]: color === "white",
[styles.small]: ["xl", "lg"].includes(size), [styles.small]: ["xl", "lg"].includes(size),
[styles.left]: align === "left", [styles.left]: align === "left",


@ -16,10 +16,26 @@
color: $base-color; color: $base-color;
} }
.shade1 {
color: $shade1;
}
.shade2 {
color: $shade2;
}
.shade3 {
color: $shade3;
}
.contrast { .contrast {
color: $contrast-color; color: $contrast-color;
} }
.contrastShade1 {
color: $contrast-shade1;
}
.white { .white {
color: $shade3; color: $shade3;
} }


@ -20,6 +20,7 @@ import { getPhoto } from '@/utils/dto/gallery'
import { BlogSliderBlock } from '@/compositions/Blocks/BlogSliderBlock' import { BlogSliderBlock } from '@/compositions/Blocks/BlogSliderBlock'
import { MassTimesBlock } from '@/compositions/Blocks/MassTimesBlock' import { MassTimesBlock } from '@/compositions/Blocks/MassTimesBlock'
import { EventsBlock } from '@/compositions/Blocks/EventsBlock' import { EventsBlock } from '@/compositions/Blocks/EventsBlock'
import { ImageCardsBlock } from '@/compositions/Blocks/ImageCardsBlock'
type BlocksProps = { type BlocksProps = {
content: Blog['content']['content'] | NonNullable<Page['content']> content: Blog['content']['content'] | NonNullable<Page['content']>
@ -137,6 +138,7 @@ export function Blocks({ content }: BlocksProps) {
subtitle={item.subtitle || undefined} subtitle={item.subtitle || undefined}
size={item.size as 'xl' | 'lg' | 'md' | 'sm' | undefined} size={item.size as 'xl' | 'lg' | 'md' | 'sm' | undefined}
align={item.align as 'left' | 'center' | undefined} align={item.align as 'left' | 'center' | undefined}
color={item.color as 'base' | 'shade1' | 'shade2' | 'shade3' | 'contrast' | 'contrastShade1' | undefined}
/> />
</Container> </Container>
) )
@ -233,6 +235,10 @@ export function Blocks({ content }: BlocksProps) {
) )
} }
if (item.blockType === 'imageCards') {
return <ImageCardsBlock key={item.id} items={item.items} />
}
if (item.blockType === 'contactPersonBlock') { if (item.blockType === 'contactPersonBlock') {
const contact = typeof item.contact === 'object' const contact = typeof item.contact === 'object'
? item.contact ? item.contact


@ -1,8 +1,13 @@
import { Highlight } from '@/payload-types'
import { fetchEvents } from '@/fetch/events' import { fetchEvents } from '@/fetch/events'
import { fetchHighlights } from '@/fetch/highlights'
import { transformEvents } from '@/utils/dto/events' import { transformEvents } from '@/utils/dto/events'
import { highlightLink } from '@/utils/dto/highlight'
import { Section } from '@/components/Section/Section' import { Section } from '@/components/Section/Section'
import { Title } from '@/components/Title/Title' import { Title } from '@/components/Title/Title'
import { EventRow } from '@/components/EventRow/EventRow'
import { Events } from '@/compositions/Events/Events' import { Events } from '@/compositions/Events/Events'
import { ContentWithSlider } from '@/compositions/ContentWithSlider/ContentWithSlider'
type EventsBlockProps = { type EventsBlockProps = {
title?: string | null title?: string | null
@ -14,18 +19,45 @@ export async function EventsBlock({
itemsPerPage = 6, itemsPerPage = 6,
}: EventsBlockProps) { }: EventsBlockProps) {
const events = await fetchEvents() const events = await fetchEvents()
const docs = events?.docs || [] const eventDocs = events?.docs || []
if (docs.length === 0) return null if (eventDocs.length === 0) return null
const highlights = await fetchHighlights()
const highlightDocs = highlights?.docs || []
return ( return (
<Section> <ContentWithSlider slider={<HighlightsSlider highlights={highlightDocs} />}>
<Title color={'contrast'} title={title || 'Veranstaltungen'} /> <Section>
<Events <Title color={'contrast'} title={title || 'Veranstaltungen'} />
events={transformEvents(docs)} <Events
n={itemsPerPage || 6} events={transformEvents(eventDocs)}
schema={'contrast'} n={itemsPerPage || 6}
/> schema={'contrast'}
</Section> />
</Section>
</ContentWithSlider>
) )
} }
const HighlightsSlider = ({ highlights }: { highlights: Highlight[] }) => (
<>
<Title
title={'Aktuelle Highlights'}
size={'md'}
fontStyle={'sans-serif'}
color={'white'}
/>
{highlights.map((highlight) => (
<EventRow
color={'white'}
key={highlight.id}
date={highlight.date}
showDate={false}
title={highlight.text}
href={highlightLink(highlight)}
cancelled={false}
/>
))}
</>
)


@ -0,0 +1,57 @@
import { Group, Media, Page } from '@/payload-types'
import { Container } from '@/components/Container/Container'
import { Section } from '@/components/Section/Section'
import { ImageCard } from '@/components/ImageCard/ImageCard'
import { getPhoto } from '@/utils/dto/gallery'
import styles from './imageCardsBlock.module.scss'
export type ImageCardsBlockItem = {
title: string
image: string | Media
link?:
| { relationTo: 'pages'; value: string | Page }
| { relationTo: 'group'; value: string | Group }
| null
customLink?: string | null
id?: string | null
}
type ImageCardsBlockProps = {
items: ImageCardsBlockItem[]
}
const resolveHref = (item: ImageCardsBlockItem): string | undefined => {
if (item.link && typeof item.link.value === 'object') {
if (item.link.relationTo === 'pages' && item.link.value.slug) {
return `/${item.link.value.slug}`
}
if (item.link.relationTo === 'group' && item.link.value.slug) {
return `/gruppe/${item.link.value.slug}`
}
}
return item.customLink ?? undefined
}
export const ImageCardsBlock = ({ items }: ImageCardsBlockProps) => {
return (
<Container>
<Section padding={"small"}>
<div className={styles.grid}>
{items.map((item) => {
const photo = getPhoto('thumbnail', item.image)
if (!photo) return null
return (
<ImageCard
key={item.id ?? item.title}
src={photo.src}
title={item.title}
href={resolveHref(item)}
/>
)
})}
</div>
</Section>
</Container>
)
}


@ -6,6 +6,8 @@ import { Button } from '@/components/Button/Button'
import { DonationForm } from '@/components/DonationForm/DonationForm' import { DonationForm } from '@/components/DonationForm/DonationForm'
import { YoutubePlayer } from '@/components/YoutubePlayer/YoutubePlayer' import { YoutubePlayer } from '@/components/YoutubePlayer/YoutubePlayer'
import { DonationAppeal } from '@/components/DonationAppeal/DonationAppeal' import { DonationAppeal } from '@/components/DonationAppeal/DonationAppeal'
import { ImageCardsBlock } from '@/compositions/Blocks/ImageCardsBlock'
import { Title } from '@/components/Title/Title'
type BlocksProps = { type BlocksProps = {
content: Parish['content'] content: Parish['content']
@ -55,6 +57,24 @@ export function ParishBlocks({ content }: BlocksProps) {
</Section> </Section>
} }
if (item.blockType === "imageCards") {
return <ImageCardsBlock key={item.id} items={item.items} />
}
if (item.blockType === "title") {
return (
<Container key={item.id}>
<Title
title={item.title}
subtitle={item.subtitle || undefined}
size={item.size as 'xl' | 'lg' | 'md' | 'sm' | undefined}
align={item.align as 'left' | 'center' | undefined}
color={item.color as 'base' | 'shade1' | 'shade2' | 'shade3' | 'contrast' | 'contrastShade1' | undefined}
/>
</Container>
)
}
if (item.blockType === "donationAppeal") { if (item.blockType === "donationAppeal") {
return <Section key={item.id} padding={"small"}> return <Section key={item.id} padding={"small"}>
<Container> <Container>


@ -0,0 +1,13 @@
.grid {
display: grid;
gap: 1.5rem;
grid-template-columns: 1fr;
@media (min-width: 640px) {
grid-template-columns: repeat(2, 1fr);
}
@media (min-width: 1024px) {
grid-template-columns: repeat(3, 1fr);
}
}


@ -31,7 +31,7 @@ export const Footer = async () => {
</Col> </Col>
<Col> <Col>
<Row gap={30}> <Row gap={25}>
{footer.groups?.map((group, i) => ( {footer.groups?.map((group, i) => (
<Col key={i}> <Col key={i}>
<p> <p>


@ -25,6 +25,7 @@
.list li { .list li {
margin-left: -10px; margin-left: -10px;
white-space: nowrap;
} }
.list li:hover::marker{ .list li:hover::marker{


@ -57,7 +57,7 @@ export const fetchWorship = async (
location: true, location: true,
title: true, title: true,
}, },
limit: 15, limit: 100,
}) as Promise<PaginatedDocs<Worship>> }) as Promise<PaginatedDocs<Worship>>
} }
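The new task file below imports `generateOccurrenceDates` from `@/jobs/lib/scheduleOccurrences`, which is outside this range. As a rough idea of the monthly mode ("every 3rd Sunday"), an nth-weekday helper could look like the following sketch; the name and semantics are assumptions, not the repo's implementation:

```typescript
// Hypothetical sketch of the monthly "nth weekday" rule (e.g. every 3rd
// Sunday). The real helper lives in '@/jobs/lib/scheduleOccurrences' and
// may differ in name, signature, and timezone handling.
export const nthWeekdayOfMonth = (
  year: number,
  month: number,   // 0-based month, as in the JS Date API
  weekday: number, // 0 = Sunday … 6 = Saturday
  nth: number,     // 1-based: 1 = first, 3 = third
): Date => {
  const first = new Date(Date.UTC(year, month, 1))
  // Days from the 1st to the first occurrence of `weekday` in this month.
  const offset = (weekday - first.getUTCDay() + 7) % 7
  return new Date(Date.UTC(year, month, 1 + offset + (nth - 1) * 7))
}
```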


@ -0,0 +1,197 @@
import type { TaskConfig } from 'payload'
import {
combineDateAndTime,
generateOccurrenceDates,
type ScheduleEntry,
} from '@/jobs/lib/scheduleOccurrences'
/**
* Payload Jobs Queue task that materializes each Church's
* `recurringSchedule` into real `Worship` documents for the coming weeks.
*
* Rationale: the rest of the app (MassTimesBlock, gemeinde pages,
* /gottesdienst/{id} detail pages) all read from the Worship collection
* via `fetchWorship`. By generating real documents instead of rendering
* from the schedule on the fly, we keep a single source of truth and
* avoid duplicating rendering logic. Generated docs are normal Worship
* documents and can be edited by hand (cancel, change celebrant, etc.).
*
* Triggers:
* - `afterChange` hook on Church (see collections/Churches.ts) queues
* a run whenever a schedule is saved
* - Own `schedule` entry below runs weekly to keep the rolling window
* populated
* - Manual: call `payload.jobs.queue({ task: 'generateRecurringMasses' })`
*/
// How far into the future we materialize documents on each run.
// The weekly cron keeps this rolling window populated.
const DEFAULT_WEEKS_AHEAD = 2
const MS_PER_WEEK = 7 * 24 * 60 * 60 * 1000
type GenerateRecurringMassesInput = {
weeksAhead?: number
churchId?: string
}
type GenerateRecurringMassesOutput = {
created: number
skipped: number
}
export const generateRecurringMassesTask: TaskConfig<{
input: GenerateRecurringMassesInput
output: GenerateRecurringMassesOutput
}> = {
slug: 'generateRecurringMasses',
label: 'Wiederkehrende Messzeiten erzeugen',
inputSchema: [
{
name: 'weeksAhead',
type: 'number',
required: false,
},
{
name: 'churchId',
type: 'text',
required: false,
},
],
outputSchema: [
{ name: 'created', type: 'number' },
{ name: 'skipped', type: 'number' },
],
// Weekly cron to keep the rolling window populated even if nobody
// edits a schedule. Payload's 6-field cron format starts with seconds.
// → every Monday at 03:00 server time.
schedule: [
{
cron: '0 0 3 * * 1',
queue: 'default',
},
],
handler: async ({ input, req }) => {
const { payload } = req
const weeksAhead = input?.weeksAhead ?? DEFAULT_WEEKS_AHEAD
const now = new Date()
const horizon = new Date(now.getTime() + weeksAhead * MS_PER_WEEK)
// Scope to a single church when invoked from the Church afterChange
// hook, otherwise process every church in one run.
const churchesResult = await payload.find({
collection: 'church',
depth: 0,
limit: 1000,
pagination: false,
where: input?.churchId
? { id: { equals: input.churchId } }
: undefined,
})
let created = 0
let skipped = 0
for (const church of churchesResult.docs) {
// Cast needed because payload-types only exposes recurringSchedule
// on the full Church interface and payload.find returns a looser
// shape at depth: 0.
const schedule = (church as { recurringSchedule?: unknown[] })
.recurringSchedule
if (!Array.isArray(schedule) || schedule.length === 0) continue
for (const rawEntry of schedule) {
const entry = rawEntry as ScheduleEntry & {
time?: string | Date | null
type?: 'MASS' | 'FAMILY' | 'WORD'
defaultCelebrant?: string | null
defaultTitle?: string | null
defaultDescription?: string | null
}
// Guard: required fields may be missing if an editor saved a
// half-filled row. Skip rather than crash the whole run.
if (!entry.time || !entry.type) {
skipped += 1
continue
}
// Resolve the entry's recurrence pattern (weekly / biweekly /
// monthly Nth weekday) into concrete calendar dates in the
// [now, horizon] window. Returns dates at midnight only — we
// combine with the time-of-day below.
const occurrenceDates = generateOccurrenceDates(entry, now, horizon)
if (occurrenceDates.length === 0) {
skipped += 1
continue
}
const timeSource = new Date(entry.time)
for (const occurrenceDate of occurrenceDates) {
// Build the real Worship.date as (calendar date) + (HH:mm from
// the schedule) using local-time components. This is the step
// that keeps DST transitions from shifting the wall-clock hour.
const date = combineDateAndTime(occurrenceDate, timeSource)
// `now` could land mid-week: skip any occurrence that already
// passed earlier today.
if (date.getTime() < now.getTime()) {
skipped += 1
continue
}
// Append-only: if *any* Worship doc already exists at this
// exact (church, timestamp) slot — whether generated by a
// previous run or entered manually — leave it alone. This is
// what protects admin edits (cancellations, celebrant changes,
// etc.) from being overwritten.
const existing = await payload.find({
collection: 'worship',
depth: 0,
limit: 1,
pagination: false,
where: {
and: [
{ location: { equals: church.id } },
{ date: { equals: date.toISOString() } },
],
},
})
if (existing.docs.length > 0) {
skipped += 1
continue
}
// `generated: true` marks this as auto-created so future
// cleanup tooling can target it without touching manual rows.
await payload.create({
collection: 'worship',
data: {
date: date.toISOString(),
location: church.id,
type: entry.type,
cancelled: false,
title: entry.defaultTitle || undefined,
celebrant: entry.defaultCelebrant || undefined,
description: entry.defaultDescription || undefined,
generated: true,
},
})
created += 1
}
}
}
// Counts surface on the Payload Jobs admin page as the task output.
payload.logger.info(
{ created, skipped, weeksAhead },
'generateRecurringMasses finished',
)
return {
output: { created, skipped },
}
},
}
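The append-only guard in the handler above can be exercised in isolation. This is a minimal sketch, assuming an in-memory `Set` stands in for the `payload.find` lookup against the Worship collection; `slotKey`, `materialize`, and the sample IDs are illustrative only, not part of the codebase.

```typescript
// Minimal sketch of the append-only idempotency guard. A Set of
// (church, timestamp) keys stands in for the Worship collection
// lookup; any pre-existing slot wins and is left untouched.
const slotKey = (churchId: string, date: Date): string =>
  `${churchId}|${date.toISOString()}`

const materialize = (
  existing: Set<string>,
  churchId: string,
  occurrences: Date[],
): { created: number; skipped: number } => {
  let created = 0
  let skipped = 0
  for (const date of occurrences) {
    const key = slotKey(churchId, date)
    if (existing.has(key)) {
      // Manual or previously generated doc at this slot: leave it alone.
      skipped += 1
      continue
    }
    existing.add(key)
    created += 1
  }
  return { created, skipped }
}

const seen = new Set([
  slotKey('church-1', new Date(Date.UTC(2026, 3, 19, 8, 30))),
])
const result = materialize(seen, 'church-1', [
  new Date(Date.UTC(2026, 3, 19, 8, 30)), // already exists → skipped
  new Date(Date.UTC(2026, 3, 26, 8, 30)), // new slot → created
])
// result: { created: 1, skipped: 1 }
```

Because the guard keys on the exact (church, timestamp) pair, re-running the task over an already-populated window is a no-op, which is what makes the weekly cron plus the afterChange trigger safe to overlap.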

View file

@ -0,0 +1,219 @@
import { describe, it, expect } from 'vitest'
import {
combineDateAndTime,
generateOccurrenceDates,
type ScheduleEntry,
} from './scheduleOccurrences'
// All dates in this file use the local-time Date constructor, so the
// tests behave the same regardless of the machine's timezone.
// Months are 0-indexed:
// new Date(2026, 3, 13) === 13 April 2026 local time
const d = (year: number, month1to12: number, day: number): Date =>
new Date(year, month1to12 - 1, day)
// Reference calendar used throughout these tests:
// April 2026 — Wed 1, Sun 5/12/19/26, Mon 6/13/20/27 (only 4 Mondays)
// March 2026 — Sun 1, Mon 2/9/16/23/30 (five Mondays, last = 30)
describe('generateOccurrenceDates weekly', () => {
it('yields the next N sundays in the window', () => {
const entry: ScheduleEntry = { frequency: 'weekly', day: 'sunday' }
const result = generateOccurrenceDates(entry, d(2026, 4, 6), d(2026, 4, 27))
expect(result.map((x) => x.getDate())).toEqual([12, 19, 26])
})
it('includes the start day itself if it matches the target weekday', () => {
const entry: ScheduleEntry = { frequency: 'weekly', day: 'sunday' }
// 2026-04-05 is a Sunday
const result = generateOccurrenceDates(entry, d(2026, 4, 5), d(2026, 4, 19))
expect(result.map((x) => x.getDate())).toEqual([5, 12, 19])
})
it('returns an empty array when the window is shorter than a week and misses the target', () => {
const entry: ScheduleEntry = { frequency: 'weekly', day: 'sunday' }
// Mon -> Sat, no Sunday in range
const result = generateOccurrenceDates(entry, d(2026, 4, 6), d(2026, 4, 11))
expect(result).toEqual([])
})
it('works for weekdays other than sunday', () => {
const entry: ScheduleEntry = { frequency: 'weekly', day: 'wednesday' }
// Apr 1 is a Wednesday
const result = generateOccurrenceDates(entry, d(2026, 4, 1), d(2026, 4, 30))
expect(result.map((x) => x.getDate())).toEqual([1, 8, 15, 22, 29])
})
})
describe('generateOccurrenceDates biweekly', () => {
it('yields dates aligned with the anchor, every two weeks', () => {
const entry: ScheduleEntry = {
frequency: 'biweekly',
day: 'sunday',
biweeklyAnchor: d(2026, 4, 5), // a Sunday
}
const result = generateOccurrenceDates(entry, d(2026, 4, 6), d(2026, 5, 31))
// from anchor: 4/5, 4/19, 5/3, 5/17, 5/31 — window starts after 4/5
expect(result.map((x) => `${x.getMonth() + 1}-${x.getDate()}`)).toEqual([
'4-19',
'5-3',
'5-17',
'5-31',
])
})
it('skips the off-parity week when the naive first-occurrence lands on it', () => {
const entry: ScheduleEntry = {
frequency: 'biweekly',
day: 'sunday',
biweeklyAnchor: d(2026, 4, 5),
}
// Start on 4/12 — that's a Sunday but off-parity (7 days from anchor).
// Next valid occurrence should be 4/19, not 4/12.
const result = generateOccurrenceDates(entry, d(2026, 4, 12), d(2026, 4, 26))
expect(result.map((x) => x.getDate())).toEqual([19])
})
it('handles anchor in the future by rolling backward in parity', () => {
const entry: ScheduleEntry = {
frequency: 'biweekly',
day: 'sunday',
biweeklyAnchor: d(2026, 6, 7), // future Sunday
}
const result = generateOccurrenceDates(entry, d(2026, 4, 6), d(2026, 5, 31))
// anchor parity backward: 6/7 - 14 = 5/24, - 14 = 5/10, -14 = 4/26, -14 = 4/12
expect(result.map((x) => `${x.getMonth() + 1}-${x.getDate()}`)).toEqual([
'4-12',
'4-26',
'5-10',
'5-24',
])
})
it('returns empty when biweeklyAnchor is missing', () => {
const entry: ScheduleEntry = { frequency: 'biweekly', day: 'sunday' }
const result = generateOccurrenceDates(entry, d(2026, 4, 1), d(2026, 4, 30))
expect(result).toEqual([])
})
})
describe('generateOccurrenceDates monthlyByWeekday', () => {
it('yields the 3rd sunday each month', () => {
const entry: ScheduleEntry = {
frequency: 'monthlyByWeekday',
day: 'sunday',
weekOfMonth: 'third',
}
const result = generateOccurrenceDates(entry, d(2026, 4, 1), d(2026, 6, 30))
    // Expected: Apr 19, May 17, Jun 21
    // May 2026: Sun 3, 10, 17, 24, 31 → third = 17
    // Jun 2026: Sun 7, 14, 21, 28 → third = 21
expect(result.map((x) => `${x.getMonth() + 1}-${x.getDate()}`)).toEqual([
'4-19',
'5-17',
'6-21',
])
})
it('yields the last sunday each month', () => {
const entry: ScheduleEntry = {
frequency: 'monthlyByWeekday',
day: 'sunday',
weekOfMonth: 'last',
}
const result = generateOccurrenceDates(entry, d(2026, 4, 1), d(2026, 6, 30))
// Apr 26, May 31, Jun 28
expect(result.map((x) => `${x.getMonth() + 1}-${x.getDate()}`)).toEqual([
'4-26',
'5-31',
'6-28',
])
})
it('treats "fourth" as the 4th occurrence, not the last', () => {
// March 2026 has 5 Mondays (2, 9, 16, 23, 30).
// "fourth" should be 23, NOT 30.
const entry: ScheduleEntry = {
frequency: 'monthlyByWeekday',
day: 'monday',
weekOfMonth: 'fourth',
}
const result = generateOccurrenceDates(entry, d(2026, 3, 1), d(2026, 3, 31))
expect(result.map((x) => x.getDate())).toEqual([23])
})
it('works when the start-of-window is after that month\'s occurrence', () => {
// 3rd Sunday of April 2026 is 4/19. Start the window on 4/20.
// April should be skipped; May should still appear.
const entry: ScheduleEntry = {
frequency: 'monthlyByWeekday',
day: 'sunday',
weekOfMonth: 'third',
}
const result = generateOccurrenceDates(entry, d(2026, 4, 20), d(2026, 5, 31))
expect(result.map((x) => `${x.getMonth() + 1}-${x.getDate()}`)).toEqual([
'5-17',
])
})
it('returns empty when weekOfMonth is missing', () => {
const entry: ScheduleEntry = {
frequency: 'monthlyByWeekday',
day: 'sunday',
}
const result = generateOccurrenceDates(entry, d(2026, 4, 1), d(2026, 4, 30))
expect(result).toEqual([])
})
})
describe('generateOccurrenceDates edge cases', () => {
it('returns empty when before < after', () => {
const entry: ScheduleEntry = { frequency: 'weekly', day: 'sunday' }
const result = generateOccurrenceDates(entry, d(2026, 4, 30), d(2026, 4, 1))
expect(result).toEqual([])
})
it('returns empty for an unknown day value', () => {
const entry = {
frequency: 'weekly',
day: 'someday',
} as unknown as ScheduleEntry
const result = generateOccurrenceDates(entry, d(2026, 4, 1), d(2026, 4, 30))
expect(result).toEqual([])
})
})
describe('combineDateAndTime', () => {
it('copies HH:mm from timeSource onto the calendar date', () => {
const date = d(2026, 4, 19)
const time = new Date(1970, 0, 1, 10, 30)
const combined = combineDateAndTime(date, time)
expect(combined.getFullYear()).toBe(2026)
expect(combined.getMonth()).toBe(3)
expect(combined.getDate()).toBe(19)
expect(combined.getHours()).toBe(10)
expect(combined.getMinutes()).toBe(30)
expect(combined.getSeconds()).toBe(0)
expect(combined.getMilliseconds()).toBe(0)
})
it('zeroes seconds and milliseconds even when the source has them', () => {
const date = d(2026, 4, 19)
const time = new Date(1970, 0, 1, 18, 45, 42, 123)
const combined = combineDateAndTime(date, time)
expect(combined.getHours()).toBe(18)
expect(combined.getMinutes()).toBe(45)
expect(combined.getSeconds()).toBe(0)
expect(combined.getMilliseconds()).toBe(0)
})
it('ignores the year/month/day of the timeSource', () => {
const date = d(2026, 4, 19)
const time = new Date(1999, 11, 31, 8, 0)
const combined = combineDateAndTime(date, time)
expect(combined.getFullYear()).toBe(2026)
expect(combined.getMonth()).toBe(3)
expect(combined.getDate()).toBe(19)
expect(combined.getHours()).toBe(8)
})
})
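The weekly cases above all reduce to one modular-arithmetic step: the distance from the start day to the target weekday, taken mod 7. This sketch re-derives that step standalone (the helper is a local copy for illustration, mirroring `firstOccurrenceOnOrAfter` in the library under test).

```typescript
// Next occurrence of a target weekday on or after `from`, at local
// midnight. JS getDay(): 0 = Sunday ... 6 = Saturday.
const firstOccurrenceOnOrAfter = (from: Date, targetDay: number): Date => {
  const start = new Date(from)
  start.setHours(0, 0, 0, 0)
  // (target - current + 7) % 7 is 0..6: 0 when `from` already matches.
  const diff = (targetDay - start.getDay() + 7) % 7
  start.setDate(start.getDate() + diff)
  return start
}

// Monday 2026-04-06 → next Sunday is 2026-04-12.
firstOccurrenceOnOrAfter(new Date(2026, 3, 6), 0).getDate() // 12
// A start day that already matches maps to itself (diff = 0).
firstOccurrenceOnOrAfter(new Date(2026, 3, 5), 0).getDate() // 5
```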

View file

@ -0,0 +1,189 @@
export type ScheduleDay =
| 'monday'
| 'tuesday'
| 'wednesday'
| 'thursday'
| 'friday'
| 'saturday'
| 'sunday'
export type ScheduleFrequency = 'weekly' | 'biweekly' | 'monthlyByWeekday'
export type WeekOfMonth = 'first' | 'second' | 'third' | 'fourth' | 'last'
export interface ScheduleEntry {
frequency: ScheduleFrequency
day: ScheduleDay
weekOfMonth?: WeekOfMonth | null
biweeklyAnchor?: string | Date | null
}
// JS getDay(): 0 = Sunday ... 6 = Saturday
const DAY_INDEX: Record<ScheduleDay, number> = {
sunday: 0,
monday: 1,
tuesday: 2,
wednesday: 3,
thursday: 4,
friday: 5,
saturday: 6,
}
const WEEK_OF_MONTH_N: Record<WeekOfMonth, number> = {
first: 1,
second: 2,
third: 3,
fourth: 4,
last: -1,
}
const MS_PER_DAY = 24 * 60 * 60 * 1000
const startOfDay = (d: Date): Date => {
const r = new Date(d)
r.setHours(0, 0, 0, 0)
return r
}
const addDays = (d: Date, n: number): Date => {
const r = new Date(d)
r.setDate(r.getDate() + n)
return r
}
const firstOccurrenceOnOrAfter = (from: Date, targetDay: number): Date => {
const start = startOfDay(from)
const diff = (targetDay - start.getDay() + 7) % 7
return addDays(start, diff)
}
/**
* For a given (year, month), return the date of the Nth occurrence of
* `targetDay`. `n` is 1..4 for first..fourth, or -1 for "last".
* Returns null if the month has no such occurrence (e.g. asking for the
* 5th Monday in a month that only has 4).
*/
const nthWeekdayOfMonth = (
year: number,
month: number,
targetDay: number,
n: number,
): Date | null => {
if (n > 0) {
const first = new Date(year, month, 1)
const diff = (targetDay - first.getDay() + 7) % 7
const day = 1 + diff + (n - 1) * 7
const daysInMonth = new Date(year, month + 1, 0).getDate()
if (day > daysInMonth) return null
return new Date(year, month, day)
}
// last occurrence of targetDay in month
const lastDay = new Date(year, month + 1, 0)
const diff = (lastDay.getDay() - targetDay + 7) % 7
return new Date(year, month, lastDay.getDate() - diff)
}
/**
* Return every calendar date (time-of-day at local midnight) on which
* this schedule entry should fire, within [after, before].
*
 * Dates only: the caller combines each date with the schedule's
 * time-of-day via {@link combineDateAndTime}. Splitting date from time
* makes DST transitions a non-issue: we always build the final Date
* with a local-time constructor.
*/
export const generateOccurrenceDates = (
entry: ScheduleEntry,
after: Date,
before: Date,
): Date[] => {
const targetDay = DAY_INDEX[entry.day]
if (targetDay === undefined) return []
if (before.getTime() < after.getTime()) return []
const dates: Date[] = []
const afterStart = startOfDay(after)
const beforeEnd = startOfDay(before)
switch (entry.frequency) {
case 'weekly': {
let cursor = firstOccurrenceOnOrAfter(afterStart, targetDay)
while (cursor.getTime() <= beforeEnd.getTime()) {
dates.push(cursor)
cursor = addDays(cursor, 7)
}
return dates
}
case 'biweekly': {
if (!entry.biweeklyAnchor) return []
const anchor = startOfDay(new Date(entry.biweeklyAnchor))
if (Number.isNaN(anchor.getTime())) return []
let cursor = firstOccurrenceOnOrAfter(afterStart, targetDay)
// Align cursor with anchor's 2-week parity. Compute the whole-day
// delta using midday to avoid DST rounding pushing it off by one.
const daysFromAnchor = Math.round(
(cursor.getTime() + MS_PER_DAY / 2 - (anchor.getTime() + MS_PER_DAY / 2)) /
MS_PER_DAY,
)
const mod = ((daysFromAnchor % 14) + 14) % 14
if (mod !== 0) cursor = addDays(cursor, 14 - mod)
while (cursor.getTime() <= beforeEnd.getTime()) {
if (cursor.getTime() >= afterStart.getTime()) dates.push(cursor)
cursor = addDays(cursor, 14)
}
return dates
}
case 'monthlyByWeekday': {
if (!entry.weekOfMonth) return []
const n = WEEK_OF_MONTH_N[entry.weekOfMonth]
let year = afterStart.getFullYear()
let month = afterStart.getMonth()
const endYear = beforeEnd.getFullYear()
const endMonth = beforeEnd.getMonth()
while (year < endYear || (year === endYear && month <= endMonth)) {
const occ = nthWeekdayOfMonth(year, month, targetDay, n)
if (
occ &&
occ.getTime() >= afterStart.getTime() &&
occ.getTime() <= beforeEnd.getTime()
) {
dates.push(occ)
}
month += 1
if (month > 11) {
month = 0
year += 1
}
}
return dates
}
default:
return []
}
}
/**
* Combine a calendar date (year/month/day from `date`) with a wall-clock
* time taken from `timeSource` (a Date whose local hours/minutes we read).
 * The result is always built in the server's local timezone, which is
 * what Payload's date fields display.
*/
export const combineDateAndTime = (date: Date, timeSource: Date): Date => {
const time = new Date(timeSource)
return new Date(
date.getFullYear(),
date.getMonth(),
date.getDate(),
time.getHours(),
time.getMinutes(),
0,
0,
)
}
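The subtlest branch above is the biweekly parity alignment: a candidate date already on the right weekday may still be seven days off from the anchor's 2-week rhythm. This sketch isolates just that step; `alignToAnchorParity` is an illustrative local function, not an export of this module.

```typescript
// Push a candidate date (already on the correct weekday) forward
// until its whole-day distance from the anchor is a multiple of 14.
const MS_PER_DAY = 24 * 60 * 60 * 1000

const alignToAnchorParity = (candidate: Date, anchor: Date): Date => {
  // Math.round absorbs the ±1h drift a DST transition can introduce
  // between two local midnights.
  const days = Math.round((candidate.getTime() - anchor.getTime()) / MS_PER_DAY)
  // Double modulo keeps the remainder non-negative even when the
  // anchor lies in the future (days < 0).
  const mod = ((days % 14) + 14) % 14
  if (mod === 0) return candidate
  const next = new Date(candidate)
  next.setDate(next.getDate() + (14 - mod))
  return next
}

// Anchor Sunday 2026-04-05; candidate Sunday 2026-04-12 is off-parity
// (7 days out) and rolls forward to 2026-04-19.
alignToAnchorParity(new Date(2026, 3, 12), new Date(2026, 3, 5)).getDate() // 19
// Future anchor Sunday 2026-06-07: 2026-04-12 is exactly 56 days
// (4 × 14) before it, so it is already on parity.
alignToAnchorParity(new Date(2026, 3, 12), new Date(2026, 5, 7)).getDate() // 12
```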

File diff suppressed because it is too large

View file

@ -0,0 +1,19 @@
import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:56:17.764Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:56:18.063Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-08T11:56:18.128Z';
ALTER TABLE "pages" ALTER COLUMN "slug" SET DEFAULT '';
ALTER TABLE "_pages_v" ALTER COLUMN "version_slug" SET DEFAULT '';`)
}
export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-03-22T22:44:18.451Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-03-22T22:44:18.743Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-04-18T21:44:18.801Z';
ALTER TABLE "pages" ALTER COLUMN "slug" DROP DEFAULT;
ALTER TABLE "_pages_v" ALTER COLUMN "version_slug" DROP DEFAULT;`)
}

File diff suppressed because it is too large

View file

@ -0,0 +1,106 @@
import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
await db.execute(sql`
CREATE TYPE "public"."enum_church_recurring_schedule_type" AS ENUM('MASS', 'FAMILY', 'WORD');
CREATE TYPE "public"."enum_church_recurring_schedule_frequency" AS ENUM('weekly', 'biweekly', 'monthlyByWeekday');
CREATE TYPE "public"."enum_church_recurring_schedule_day" AS ENUM('monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday');
CREATE TYPE "public"."enum_church_recurring_schedule_week_of_month" AS ENUM('first', 'second', 'third', 'fourth', 'last');
CREATE TYPE "public"."enum_payload_jobs_log_task_slug" AS ENUM('inline', 'generateRecurringMasses');
CREATE TYPE "public"."enum_payload_jobs_log_state" AS ENUM('failed', 'succeeded');
CREATE TYPE "public"."enum_payload_jobs_task_slug" AS ENUM('inline', 'generateRecurringMasses');
CREATE TABLE "church_recurring_schedule" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"type" "enum_church_recurring_schedule_type" NOT NULL,
"frequency" "enum_church_recurring_schedule_frequency" DEFAULT 'weekly' NOT NULL,
"day" "enum_church_recurring_schedule_day" NOT NULL,
"time" timestamp(3) with time zone NOT NULL,
"week_of_month" "enum_church_recurring_schedule_week_of_month",
"biweekly_anchor" timestamp(3) with time zone,
"default_celebrant" varchar,
"default_title" varchar,
"notes" varchar
);
CREATE TABLE "payload_jobs_log" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"executed_at" timestamp(3) with time zone NOT NULL,
"completed_at" timestamp(3) with time zone NOT NULL,
"task_slug" "enum_payload_jobs_log_task_slug" NOT NULL,
"task_i_d" varchar NOT NULL,
"input" jsonb,
"output" jsonb,
"state" "enum_payload_jobs_log_state" NOT NULL,
"error" jsonb
);
CREATE TABLE "payload_jobs" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"input" jsonb,
"completed_at" timestamp(3) with time zone,
"total_tried" numeric DEFAULT 0,
"has_error" boolean DEFAULT false,
"error" jsonb,
"task_slug" "enum_payload_jobs_task_slug",
"queue" varchar DEFAULT 'default',
"wait_until" timestamp(3) with time zone,
"processing" boolean DEFAULT false,
"meta" jsonb,
"updated_at" timestamp(3) with time zone DEFAULT now() NOT NULL,
"created_at" timestamp(3) with time zone DEFAULT now() NOT NULL
);
CREATE TABLE "payload_jobs_stats" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"stats" jsonb,
"updated_at" timestamp(3) with time zone,
"created_at" timestamp(3) with time zone
);
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T14:31:48.373Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T14:31:48.665Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-08T14:31:48.733Z';
ALTER TABLE "worship" ADD COLUMN "generated" boolean DEFAULT false;
ALTER TABLE "church_recurring_schedule" ADD CONSTRAINT "church_recurring_schedule_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."church"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "payload_jobs_log" ADD CONSTRAINT "payload_jobs_log_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."payload_jobs"("id") ON DELETE cascade ON UPDATE no action;
CREATE INDEX "church_recurring_schedule_order_idx" ON "church_recurring_schedule" USING btree ("_order");
CREATE INDEX "church_recurring_schedule_parent_id_idx" ON "church_recurring_schedule" USING btree ("_parent_id");
CREATE INDEX "payload_jobs_log_order_idx" ON "payload_jobs_log" USING btree ("_order");
CREATE INDEX "payload_jobs_log_parent_id_idx" ON "payload_jobs_log" USING btree ("_parent_id");
CREATE INDEX "payload_jobs_completed_at_idx" ON "payload_jobs" USING btree ("completed_at");
CREATE INDEX "payload_jobs_total_tried_idx" ON "payload_jobs" USING btree ("total_tried");
CREATE INDEX "payload_jobs_has_error_idx" ON "payload_jobs" USING btree ("has_error");
CREATE INDEX "payload_jobs_task_slug_idx" ON "payload_jobs" USING btree ("task_slug");
CREATE INDEX "payload_jobs_queue_idx" ON "payload_jobs" USING btree ("queue");
CREATE INDEX "payload_jobs_wait_until_idx" ON "payload_jobs" USING btree ("wait_until");
CREATE INDEX "payload_jobs_processing_idx" ON "payload_jobs" USING btree ("processing");
CREATE INDEX "payload_jobs_updated_at_idx" ON "payload_jobs" USING btree ("updated_at");
CREATE INDEX "payload_jobs_created_at_idx" ON "payload_jobs" USING btree ("created_at");`)
}
export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "church_recurring_schedule" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "payload_jobs_log" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "payload_jobs" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "payload_jobs_stats" DISABLE ROW LEVEL SECURITY;
DROP TABLE "church_recurring_schedule" CASCADE;
DROP TABLE "payload_jobs_log" CASCADE;
DROP TABLE "payload_jobs" CASCADE;
DROP TABLE "payload_jobs_stats" CASCADE;
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:56:17.764Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:56:18.063Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-08T11:56:18.128Z';
ALTER TABLE "worship" DROP COLUMN "generated";
DROP TYPE "public"."enum_church_recurring_schedule_type";
DROP TYPE "public"."enum_church_recurring_schedule_frequency";
DROP TYPE "public"."enum_church_recurring_schedule_day";
DROP TYPE "public"."enum_church_recurring_schedule_week_of_month";
DROP TYPE "public"."enum_payload_jobs_log_task_slug";
DROP TYPE "public"."enum_payload_jobs_log_state";
DROP TYPE "public"."enum_payload_jobs_task_slug";`)
}

File diff suppressed because it is too large

View file

@ -0,0 +1,17 @@
import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T08:36:37.826Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T08:36:38.113Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-09T08:36:38.168Z';
ALTER TABLE "church_recurring_schedule" ADD COLUMN "default_description" varchar;`)
}
export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T14:31:48.373Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T14:31:48.665Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-08T14:31:48.733Z';
ALTER TABLE "church_recurring_schedule" DROP COLUMN "default_description";`)
}

File diff suppressed because it is too large

View file

@ -0,0 +1,295 @@
import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
await db.execute(sql`
CREATE TABLE "parish_blocks_image_cards_items" (
"_order" integer NOT NULL,
"_parent_id" varchar NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"title" varchar,
"image_id" uuid,
"custom_link" varchar
);
CREATE TABLE "parish_blocks_image_cards" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"block_name" varchar
);
CREATE TABLE "_parish_v_blocks_image_cards_items" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"title" varchar,
"image_id" uuid,
"custom_link" varchar,
"_uuid" varchar
);
CREATE TABLE "_parish_v_blocks_image_cards" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"_uuid" varchar,
"block_name" varchar
);
CREATE TABLE "blog_blocks_image_cards_items" (
"_order" integer NOT NULL,
"_parent_id" varchar NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"title" varchar,
"image_id" uuid,
"custom_link" varchar
);
CREATE TABLE "blog_blocks_image_cards" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"block_name" varchar
);
CREATE TABLE "_blog_v_blocks_image_cards_items" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"title" varchar,
"image_id" uuid,
"custom_link" varchar,
"_uuid" varchar
);
CREATE TABLE "_blog_v_blocks_image_cards" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"_uuid" varchar,
"block_name" varchar
);
CREATE TABLE "pages_blocks_image_cards_items" (
"_order" integer NOT NULL,
"_parent_id" varchar NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"title" varchar,
"image_id" uuid,
"custom_link" varchar
);
CREATE TABLE "pages_blocks_image_cards" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"block_name" varchar
);
CREATE TABLE "pages_rels" (
"id" serial PRIMARY KEY NOT NULL,
"order" integer,
"parent_id" uuid NOT NULL,
"path" varchar NOT NULL,
"pages_id" uuid,
"group_id" uuid
);
CREATE TABLE "_pages_v_blocks_image_cards_items" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"title" varchar,
"image_id" uuid,
"custom_link" varchar,
"_uuid" varchar
);
CREATE TABLE "_pages_v_blocks_image_cards" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"_uuid" varchar,
"block_name" varchar
);
CREATE TABLE "_pages_v_rels" (
"id" serial PRIMARY KEY NOT NULL,
"order" integer,
"parent_id" uuid NOT NULL,
"path" varchar NOT NULL,
"pages_id" uuid,
"group_id" uuid
);
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:40:57.328Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:40:57.621Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-10T11:40:57.679Z';
ALTER TABLE "parish_rels" ADD COLUMN "pages_id" uuid;
ALTER TABLE "parish_rels" ADD COLUMN "group_id" uuid;
ALTER TABLE "_parish_v_rels" ADD COLUMN "pages_id" uuid;
ALTER TABLE "_parish_v_rels" ADD COLUMN "group_id" uuid;
ALTER TABLE "blog_rels" ADD COLUMN "pages_id" uuid;
ALTER TABLE "blog_rels" ADD COLUMN "group_id" uuid;
ALTER TABLE "_blog_v_rels" ADD COLUMN "pages_id" uuid;
ALTER TABLE "_blog_v_rels" ADD COLUMN "group_id" uuid;
ALTER TABLE "parish_blocks_image_cards_items" ADD CONSTRAINT "parish_blocks_image_cards_items_image_id_media_id_fk" FOREIGN KEY ("image_id") REFERENCES "public"."media"("id") ON DELETE set null ON UPDATE no action;
ALTER TABLE "parish_blocks_image_cards_items" ADD CONSTRAINT "parish_blocks_image_cards_items_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."parish_blocks_image_cards"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "parish_blocks_image_cards" ADD CONSTRAINT "parish_blocks_image_cards_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."parish"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_parish_v_blocks_image_cards_items" ADD CONSTRAINT "_parish_v_blocks_image_cards_items_image_id_media_id_fk" FOREIGN KEY ("image_id") REFERENCES "public"."media"("id") ON DELETE set null ON UPDATE no action;
ALTER TABLE "_parish_v_blocks_image_cards_items" ADD CONSTRAINT "_parish_v_blocks_image_cards_items_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."_parish_v_blocks_image_cards"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_parish_v_blocks_image_cards" ADD CONSTRAINT "_parish_v_blocks_image_cards_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."_parish_v"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "blog_blocks_image_cards_items" ADD CONSTRAINT "blog_blocks_image_cards_items_image_id_media_id_fk" FOREIGN KEY ("image_id") REFERENCES "public"."media"("id") ON DELETE set null ON UPDATE no action;
ALTER TABLE "blog_blocks_image_cards_items" ADD CONSTRAINT "blog_blocks_image_cards_items_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."blog_blocks_image_cards"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "blog_blocks_image_cards" ADD CONSTRAINT "blog_blocks_image_cards_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."blog"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_blog_v_blocks_image_cards_items" ADD CONSTRAINT "_blog_v_blocks_image_cards_items_image_id_media_id_fk" FOREIGN KEY ("image_id") REFERENCES "public"."media"("id") ON DELETE set null ON UPDATE no action;
ALTER TABLE "_blog_v_blocks_image_cards_items" ADD CONSTRAINT "_blog_v_blocks_image_cards_items_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."_blog_v_blocks_image_cards"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_blog_v_blocks_image_cards" ADD CONSTRAINT "_blog_v_blocks_image_cards_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."_blog_v"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "pages_blocks_image_cards_items" ADD CONSTRAINT "pages_blocks_image_cards_items_image_id_media_id_fk" FOREIGN KEY ("image_id") REFERENCES "public"."media"("id") ON DELETE set null ON UPDATE no action;
ALTER TABLE "pages_blocks_image_cards_items" ADD CONSTRAINT "pages_blocks_image_cards_items_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."pages_blocks_image_cards"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "pages_blocks_image_cards" ADD CONSTRAINT "pages_blocks_image_cards_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "pages_rels" ADD CONSTRAINT "pages_rels_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "pages_rels" ADD CONSTRAINT "pages_rels_pages_fk" FOREIGN KEY ("pages_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "pages_rels" ADD CONSTRAINT "pages_rels_group_fk" FOREIGN KEY ("group_id") REFERENCES "public"."group"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_pages_v_blocks_image_cards_items" ADD CONSTRAINT "_pages_v_blocks_image_cards_items_image_id_media_id_fk" FOREIGN KEY ("image_id") REFERENCES "public"."media"("id") ON DELETE set null ON UPDATE no action;
ALTER TABLE "_pages_v_blocks_image_cards_items" ADD CONSTRAINT "_pages_v_blocks_image_cards_items_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."_pages_v_blocks_image_cards"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_pages_v_blocks_image_cards" ADD CONSTRAINT "_pages_v_blocks_image_cards_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."_pages_v"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_pages_v_rels" ADD CONSTRAINT "_pages_v_rels_parent_fk" FOREIGN KEY ("parent_id") REFERENCES "public"."_pages_v"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_pages_v_rels" ADD CONSTRAINT "_pages_v_rels_pages_fk" FOREIGN KEY ("pages_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_pages_v_rels" ADD CONSTRAINT "_pages_v_rels_group_fk" FOREIGN KEY ("group_id") REFERENCES "public"."group"("id") ON DELETE cascade ON UPDATE no action;
CREATE INDEX "parish_blocks_image_cards_items_order_idx" ON "parish_blocks_image_cards_items" USING btree ("_order");
CREATE INDEX "parish_blocks_image_cards_items_parent_id_idx" ON "parish_blocks_image_cards_items" USING btree ("_parent_id");
CREATE INDEX "parish_blocks_image_cards_items_image_idx" ON "parish_blocks_image_cards_items" USING btree ("image_id");
CREATE INDEX "parish_blocks_image_cards_order_idx" ON "parish_blocks_image_cards" USING btree ("_order");
CREATE INDEX "parish_blocks_image_cards_parent_id_idx" ON "parish_blocks_image_cards" USING btree ("_parent_id");
CREATE INDEX "parish_blocks_image_cards_path_idx" ON "parish_blocks_image_cards" USING btree ("_path");
CREATE INDEX "_parish_v_blocks_image_cards_items_order_idx" ON "_parish_v_blocks_image_cards_items" USING btree ("_order");
CREATE INDEX "_parish_v_blocks_image_cards_items_parent_id_idx" ON "_parish_v_blocks_image_cards_items" USING btree ("_parent_id");
CREATE INDEX "_parish_v_blocks_image_cards_items_image_idx" ON "_parish_v_blocks_image_cards_items" USING btree ("image_id");
CREATE INDEX "_parish_v_blocks_image_cards_order_idx" ON "_parish_v_blocks_image_cards" USING btree ("_order");
CREATE INDEX "_parish_v_blocks_image_cards_parent_id_idx" ON "_parish_v_blocks_image_cards" USING btree ("_parent_id");
CREATE INDEX "_parish_v_blocks_image_cards_path_idx" ON "_parish_v_blocks_image_cards" USING btree ("_path");
CREATE INDEX "blog_blocks_image_cards_items_order_idx" ON "blog_blocks_image_cards_items" USING btree ("_order");
CREATE INDEX "blog_blocks_image_cards_items_parent_id_idx" ON "blog_blocks_image_cards_items" USING btree ("_parent_id");
CREATE INDEX "blog_blocks_image_cards_items_image_idx" ON "blog_blocks_image_cards_items" USING btree ("image_id");
CREATE INDEX "blog_blocks_image_cards_order_idx" ON "blog_blocks_image_cards" USING btree ("_order");
CREATE INDEX "blog_blocks_image_cards_parent_id_idx" ON "blog_blocks_image_cards" USING btree ("_parent_id");
CREATE INDEX "blog_blocks_image_cards_path_idx" ON "blog_blocks_image_cards" USING btree ("_path");
CREATE INDEX "_blog_v_blocks_image_cards_items_order_idx" ON "_blog_v_blocks_image_cards_items" USING btree ("_order");
CREATE INDEX "_blog_v_blocks_image_cards_items_parent_id_idx" ON "_blog_v_blocks_image_cards_items" USING btree ("_parent_id");
CREATE INDEX "_blog_v_blocks_image_cards_items_image_idx" ON "_blog_v_blocks_image_cards_items" USING btree ("image_id");
CREATE INDEX "_blog_v_blocks_image_cards_order_idx" ON "_blog_v_blocks_image_cards" USING btree ("_order");
CREATE INDEX "_blog_v_blocks_image_cards_parent_id_idx" ON "_blog_v_blocks_image_cards" USING btree ("_parent_id");
CREATE INDEX "_blog_v_blocks_image_cards_path_idx" ON "_blog_v_blocks_image_cards" USING btree ("_path");
CREATE INDEX "pages_blocks_image_cards_items_order_idx" ON "pages_blocks_image_cards_items" USING btree ("_order");
CREATE INDEX "pages_blocks_image_cards_items_parent_id_idx" ON "pages_blocks_image_cards_items" USING btree ("_parent_id");
CREATE INDEX "pages_blocks_image_cards_items_image_idx" ON "pages_blocks_image_cards_items" USING btree ("image_id");
CREATE INDEX "pages_blocks_image_cards_order_idx" ON "pages_blocks_image_cards" USING btree ("_order");
CREATE INDEX "pages_blocks_image_cards_parent_id_idx" ON "pages_blocks_image_cards" USING btree ("_parent_id");
CREATE INDEX "pages_blocks_image_cards_path_idx" ON "pages_blocks_image_cards" USING btree ("_path");
CREATE INDEX "pages_rels_order_idx" ON "pages_rels" USING btree ("order");
CREATE INDEX "pages_rels_parent_idx" ON "pages_rels" USING btree ("parent_id");
CREATE INDEX "pages_rels_path_idx" ON "pages_rels" USING btree ("path");
CREATE INDEX "pages_rels_pages_id_idx" ON "pages_rels" USING btree ("pages_id");
CREATE INDEX "pages_rels_group_id_idx" ON "pages_rels" USING btree ("group_id");
CREATE INDEX "_pages_v_blocks_image_cards_items_order_idx" ON "_pages_v_blocks_image_cards_items" USING btree ("_order");
CREATE INDEX "_pages_v_blocks_image_cards_items_parent_id_idx" ON "_pages_v_blocks_image_cards_items" USING btree ("_parent_id");
CREATE INDEX "_pages_v_blocks_image_cards_items_image_idx" ON "_pages_v_blocks_image_cards_items" USING btree ("image_id");
CREATE INDEX "_pages_v_blocks_image_cards_order_idx" ON "_pages_v_blocks_image_cards" USING btree ("_order");
CREATE INDEX "_pages_v_blocks_image_cards_parent_id_idx" ON "_pages_v_blocks_image_cards" USING btree ("_parent_id");
CREATE INDEX "_pages_v_blocks_image_cards_path_idx" ON "_pages_v_blocks_image_cards" USING btree ("_path");
CREATE INDEX "_pages_v_rels_order_idx" ON "_pages_v_rels" USING btree ("order");
CREATE INDEX "_pages_v_rels_parent_idx" ON "_pages_v_rels" USING btree ("parent_id");
CREATE INDEX "_pages_v_rels_path_idx" ON "_pages_v_rels" USING btree ("path");
CREATE INDEX "_pages_v_rels_pages_id_idx" ON "_pages_v_rels" USING btree ("pages_id");
CREATE INDEX "_pages_v_rels_group_id_idx" ON "_pages_v_rels" USING btree ("group_id");
ALTER TABLE "parish_rels" ADD CONSTRAINT "parish_rels_pages_fk" FOREIGN KEY ("pages_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "parish_rels" ADD CONSTRAINT "parish_rels_group_fk" FOREIGN KEY ("group_id") REFERENCES "public"."group"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_parish_v_rels" ADD CONSTRAINT "_parish_v_rels_pages_fk" FOREIGN KEY ("pages_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_parish_v_rels" ADD CONSTRAINT "_parish_v_rels_group_fk" FOREIGN KEY ("group_id") REFERENCES "public"."group"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "blog_rels" ADD CONSTRAINT "blog_rels_pages_fk" FOREIGN KEY ("pages_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "blog_rels" ADD CONSTRAINT "blog_rels_group_fk" FOREIGN KEY ("group_id") REFERENCES "public"."group"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_blog_v_rels" ADD CONSTRAINT "_blog_v_rels_pages_fk" FOREIGN KEY ("pages_id") REFERENCES "public"."pages"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_blog_v_rels" ADD CONSTRAINT "_blog_v_rels_group_fk" FOREIGN KEY ("group_id") REFERENCES "public"."group"("id") ON DELETE cascade ON UPDATE no action;
CREATE INDEX "parish_rels_pages_id_idx" ON "parish_rels" USING btree ("pages_id");
CREATE INDEX "parish_rels_group_id_idx" ON "parish_rels" USING btree ("group_id");
CREATE INDEX "_parish_v_rels_pages_id_idx" ON "_parish_v_rels" USING btree ("pages_id");
CREATE INDEX "_parish_v_rels_group_id_idx" ON "_parish_v_rels" USING btree ("group_id");
CREATE INDEX "blog_rels_pages_id_idx" ON "blog_rels" USING btree ("pages_id");
CREATE INDEX "blog_rels_group_id_idx" ON "blog_rels" USING btree ("group_id");
CREATE INDEX "_blog_v_rels_pages_id_idx" ON "_blog_v_rels" USING btree ("pages_id");
CREATE INDEX "_blog_v_rels_group_id_idx" ON "_blog_v_rels" USING btree ("group_id");`)
}
export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "parish_blocks_image_cards_items" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "parish_blocks_image_cards" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_parish_v_blocks_image_cards_items" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_parish_v_blocks_image_cards" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "blog_blocks_image_cards_items" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "blog_blocks_image_cards" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_blog_v_blocks_image_cards_items" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_blog_v_blocks_image_cards" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "pages_blocks_image_cards_items" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "pages_blocks_image_cards" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "pages_rels" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_pages_v_blocks_image_cards_items" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_pages_v_blocks_image_cards" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_pages_v_rels" DISABLE ROW LEVEL SECURITY;
DROP TABLE "parish_blocks_image_cards_items" CASCADE;
DROP TABLE "parish_blocks_image_cards" CASCADE;
DROP TABLE "_parish_v_blocks_image_cards_items" CASCADE;
DROP TABLE "_parish_v_blocks_image_cards" CASCADE;
DROP TABLE "blog_blocks_image_cards_items" CASCADE;
DROP TABLE "blog_blocks_image_cards" CASCADE;
DROP TABLE "_blog_v_blocks_image_cards_items" CASCADE;
DROP TABLE "_blog_v_blocks_image_cards" CASCADE;
DROP TABLE "pages_blocks_image_cards_items" CASCADE;
DROP TABLE "pages_blocks_image_cards" CASCADE;
DROP TABLE "pages_rels" CASCADE;
DROP TABLE "_pages_v_blocks_image_cards_items" CASCADE;
DROP TABLE "_pages_v_blocks_image_cards" CASCADE;
DROP TABLE "_pages_v_rels" CASCADE;
ALTER TABLE "parish_rels" DROP CONSTRAINT "parish_rels_pages_fk";
ALTER TABLE "parish_rels" DROP CONSTRAINT "parish_rels_group_fk";
ALTER TABLE "_parish_v_rels" DROP CONSTRAINT "_parish_v_rels_pages_fk";
ALTER TABLE "_parish_v_rels" DROP CONSTRAINT "_parish_v_rels_group_fk";
ALTER TABLE "blog_rels" DROP CONSTRAINT "blog_rels_pages_fk";
ALTER TABLE "blog_rels" DROP CONSTRAINT "blog_rels_group_fk";
ALTER TABLE "_blog_v_rels" DROP CONSTRAINT "_blog_v_rels_pages_fk";
ALTER TABLE "_blog_v_rels" DROP CONSTRAINT "_blog_v_rels_group_fk";
DROP INDEX "parish_rels_pages_id_idx";
DROP INDEX "parish_rels_group_id_idx";
DROP INDEX "_parish_v_rels_pages_id_idx";
DROP INDEX "_parish_v_rels_group_id_idx";
DROP INDEX "blog_rels_pages_id_idx";
DROP INDEX "blog_rels_group_id_idx";
DROP INDEX "_blog_v_rels_pages_id_idx";
DROP INDEX "_blog_v_rels_group_id_idx";
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T08:36:37.826Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T08:36:38.113Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-09T08:36:38.168Z';
ALTER TABLE "parish_rels" DROP COLUMN "pages_id";
ALTER TABLE "parish_rels" DROP COLUMN "group_id";
ALTER TABLE "_parish_v_rels" DROP COLUMN "pages_id";
ALTER TABLE "_parish_v_rels" DROP COLUMN "group_id";
ALTER TABLE "blog_rels" DROP COLUMN "pages_id";
ALTER TABLE "blog_rels" DROP COLUMN "group_id";
ALTER TABLE "_blog_v_rels" DROP COLUMN "pages_id";
ALTER TABLE "_blog_v_rels" DROP COLUMN "group_id";`)
}

File diff suppressed because it is too large


@@ -0,0 +1,60 @@
import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
await db.execute(sql`
CREATE TYPE "public"."enum_parish_blocks_title_size" AS ENUM('xl', 'lg', 'md', 'sm');
CREATE TYPE "public"."enum_parish_blocks_title_align" AS ENUM('left', 'center');
CREATE TYPE "public"."enum__parish_v_blocks_title_size" AS ENUM('xl', 'lg', 'md', 'sm');
CREATE TYPE "public"."enum__parish_v_blocks_title_align" AS ENUM('left', 'center');
CREATE TABLE "parish_blocks_title" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" varchar PRIMARY KEY NOT NULL,
"title" varchar,
"subtitle" varchar,
"size" "enum_parish_blocks_title_size" DEFAULT 'lg',
"align" "enum_parish_blocks_title_align" DEFAULT 'left',
"block_name" varchar
);
CREATE TABLE "_parish_v_blocks_title" (
"_order" integer NOT NULL,
"_parent_id" uuid NOT NULL,
"_path" text NOT NULL,
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"title" varchar,
"subtitle" varchar,
"size" "enum__parish_v_blocks_title_size" DEFAULT 'lg',
"align" "enum__parish_v_blocks_title_align" DEFAULT 'left',
"_uuid" varchar,
"block_name" varchar
);
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:52:47.682Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:52:47.973Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-10T11:52:48.026Z';
ALTER TABLE "parish_blocks_title" ADD CONSTRAINT "parish_blocks_title_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."parish"("id") ON DELETE cascade ON UPDATE no action;
ALTER TABLE "_parish_v_blocks_title" ADD CONSTRAINT "_parish_v_blocks_title_parent_id_fk" FOREIGN KEY ("_parent_id") REFERENCES "public"."_parish_v"("id") ON DELETE cascade ON UPDATE no action;
CREATE INDEX "parish_blocks_title_order_idx" ON "parish_blocks_title" USING btree ("_order");
CREATE INDEX "parish_blocks_title_parent_id_idx" ON "parish_blocks_title" USING btree ("_parent_id");
CREATE INDEX "parish_blocks_title_path_idx" ON "parish_blocks_title" USING btree ("_path");
CREATE INDEX "_parish_v_blocks_title_order_idx" ON "_parish_v_blocks_title" USING btree ("_order");
CREATE INDEX "_parish_v_blocks_title_parent_id_idx" ON "_parish_v_blocks_title" USING btree ("_parent_id");
CREATE INDEX "_parish_v_blocks_title_path_idx" ON "_parish_v_blocks_title" USING btree ("_path");`)
}
export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "parish_blocks_title" DISABLE ROW LEVEL SECURITY;
ALTER TABLE "_parish_v_blocks_title" DISABLE ROW LEVEL SECURITY;
DROP TABLE "parish_blocks_title" CASCADE;
DROP TABLE "_parish_v_blocks_title" CASCADE;
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:40:57.328Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:40:57.621Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-10T11:40:57.679Z';
DROP TYPE "public"."enum_parish_blocks_title_size";
DROP TYPE "public"."enum_parish_blocks_title_align";
DROP TYPE "public"."enum__parish_v_blocks_title_size";
DROP TYPE "public"."enum__parish_v_blocks_title_align";`)
}

File diff suppressed because it is too large


@@ -0,0 +1,31 @@
import { MigrateUpArgs, MigrateDownArgs, sql } from '@payloadcms/db-postgres'
export async function up({ db, payload, req }: MigrateUpArgs): Promise<void> {
await db.execute(sql`
CREATE TYPE "public"."enum_parish_blocks_title_color" AS ENUM('base', 'shade1', 'shade2', 'shade3', 'contrast', 'contrastShade1');
CREATE TYPE "public"."enum__parish_v_blocks_title_color" AS ENUM('base', 'shade1', 'shade2', 'shade3', 'contrast', 'contrastShade1');
CREATE TYPE "public"."enum_pages_blocks_title_color" AS ENUM('base', 'shade1', 'shade2', 'shade3', 'contrast', 'contrastShade1');
CREATE TYPE "public"."enum__pages_v_blocks_title_color" AS ENUM('base', 'shade1', 'shade2', 'shade3', 'contrast', 'contrastShade1');
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T12:46:39.042Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T12:46:39.348Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-10T12:46:39.403Z';
ALTER TABLE "parish_blocks_title" ADD COLUMN "color" "enum_parish_blocks_title_color" DEFAULT 'base';
ALTER TABLE "_parish_v_blocks_title" ADD COLUMN "color" "enum__parish_v_blocks_title_color" DEFAULT 'base';
ALTER TABLE "pages_blocks_title" ADD COLUMN "color" "enum_pages_blocks_title_color" DEFAULT 'base';
ALTER TABLE "_pages_v_blocks_title" ADD COLUMN "color" "enum__pages_v_blocks_title_color" DEFAULT 'base';`)
}
export async function down({ db, payload, req }: MigrateDownArgs): Promise<void> {
await db.execute(sql`
ALTER TABLE "announcement" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:52:47.682Z';
ALTER TABLE "calendar" ALTER COLUMN "date" SET DEFAULT '2026-04-12T11:52:47.973Z';
ALTER TABLE "classifieds" ALTER COLUMN "until" SET DEFAULT '2026-05-10T11:52:48.026Z';
ALTER TABLE "parish_blocks_title" DROP COLUMN "color";
ALTER TABLE "_parish_v_blocks_title" DROP COLUMN "color";
ALTER TABLE "pages_blocks_title" DROP COLUMN "color";
ALTER TABLE "_pages_v_blocks_title" DROP COLUMN "color";
DROP TYPE "public"."enum_parish_blocks_title_color";
DROP TYPE "public"."enum__parish_v_blocks_title_color";
DROP TYPE "public"."enum_pages_blocks_title_color";
DROP TYPE "public"."enum__pages_v_blocks_title_color";`)
}


@@ -24,6 +24,12 @@ import * as migration_20260311_110236_live_preview from './20260311_110236_live_
import * as migration_20260319_215840_collaps_item from './20260319_215840_collaps_item';
import * as migration_20260319_223804_contactperson_block from './20260319_223804_contactperson_block';
import * as migration_20260319_224419 from './20260319_224419';
import * as migration_20260408_115618 from './20260408_115618';
import * as migration_20260408_143149 from './20260408_143149';
import * as migration_20260409_083638 from './20260409_083638';
import * as migration_20260410_114057_imagecards_block from './20260410_114057_imagecards_block';
import * as migration_20260410_115248_parish_title_block from './20260410_115248_parish_title_block';
import * as migration_20260410_124639 from './20260410_124639';
export const migrations = [
{
@@ -154,6 +160,36 @@ export const migrations = [
{
up: migration_20260319_224419.up,
down: migration_20260319_224419.down,
-name: '20260319_224419'
+name: '20260319_224419',
},
{
up: migration_20260408_115618.up,
down: migration_20260408_115618.down,
name: '20260408_115618',
},
{
up: migration_20260408_143149.up,
down: migration_20260408_143149.down,
name: '20260408_143149',
},
{
up: migration_20260409_083638.up,
down: migration_20260409_083638.down,
name: '20260409_083638',
},
{
up: migration_20260410_114057_imagecards_block.up,
down: migration_20260410_114057_imagecards_block.down,
name: '20260410_114057_imagecards_block',
},
{
up: migration_20260410_115248_parish_title_block.up,
down: migration_20260410_115248_parish_title_block.down,
name: '20260410_115248_parish_title_block',
},
{
up: migration_20260410_124639.up,
down: migration_20260410_124639.down,
name: '20260410_124639'
},
];


@@ -71,7 +71,7 @@ export const Worship = ({ worship }: WorshipPageProps) => {
<div className={styles.church}>
<ChurchIcon
church={church(typeof worship.location == "object" ? worship.location.name : "clara")}
-color={"#426156"}
+color={"var(--base-color)"}
style={"filled"}
stroke={3}
/>

File diff suppressed because it is too large


@@ -43,6 +43,7 @@ import { DonationForms } from '@/collections/DonationForms'
import { Pages } from '@/collections/Pages'
import { Prayers } from '@/collections/Prayers'
import { siteConfig } from '@/config/site'
import { generateRecurringMassesTask } from '@/jobs/generateRecurringMasses'
const filename = fileURLToPath(import.meta.url)
const dirname = path.dirname(filename)
@@ -104,6 +105,27 @@ export default buildConfig({
MenuGlobal,
FooterGlobal,
],
jobs: {
tasks: [generateRecurringMassesTask],
autoRun: [
{
// every 15 minutes (6-field cron, seconds first)
cron: '0 */15 * * * *',
queue: 'default',
limit: 10,
},
],
// hide the jobs collection from the admin panel in production (visible during development)
jobsCollectionOverrides: ({ defaultJobsCollection }) => {
if (!defaultJobsCollection.admin) {
defaultJobsCollection.admin = {}
}
defaultJobsCollection.admin.hidden = process.env.NODE_ENV === 'production'
return defaultJobsCollection
},
shouldAutoRun: () => process.env.NODE_ENV === 'production',
},
graphQL: {
disable: true
},
@@ -116,7 +138,21 @@
HeadingFeature({ enabledHeadingSizes: ["h3","h4","h5"]}),
AlignFeature(),
UnorderedListFeature(),
-LinkFeature(),
+LinkFeature({
fields: ({ defaultFields }) => [
...defaultFields,
{
name: 'appearance',
type: 'select',
defaultValue: 'link',
label: 'Darstellung',
options: [
{ label: 'Link', value: 'link' },
{ label: 'Button', value: 'button' },
],
},
],
}),
ParagraphFeature(),
InlineToolbarFeature(),
FixedToolbarFeature()


@@ -2,7 +2,7 @@
* Convert string to a church
* @param s
*/
-export const church = (s: string) : "anna" | "christophorus" | "richard" | "eduard" | "clara" | "joseph" | "franziskus" | "antonius" | "marien" | "maria" | "antoniusFalkenberg" | "johannesNepomuk" => {
+export const church = (s: string) : "anna" | "christophorus" | "richard" | "eduard" | "clara" | "joseph" | "franziskus" | "antonius" | "marien" | "maria" | "antoniusFrankenberg" | "johannesNepomuk" => {
const lower = s.toLowerCase()
@@ -37,8 +37,8 @@ export const church = (s: string) : "anna" | "christophorus" | "richard" | "edua
return "franziskus" return "franziskus"
} }
if (lower.includes("falkenberg")) { if (lower.includes("frankenberg")) {
return "antoniusFalkenberg" return "antoniusFrankenberg"
} }
if (lower.includes("antonius")) { if (lower.includes("antonius")) {