Updated documentation for Quarkus and improved its structure
7	.claude/settings.local.json	Normal file
@@ -0,0 +1,7 @@
{
  "permissions": {
    "allow": [
      "WebSearch"
    ]
  }
}
2	agent-backend/.gitignore	vendored	Normal file
@@ -0,0 +1,2 @@
.env
target
3	agent-backend/.mvn/wrapper/maven-wrapper.properties	vendored	Normal file
@@ -0,0 +1,3 @@
wrapperVersion=3.3.4
distributionType=only-script
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.14/apache-maven-3.9.14-bin.zip
3	agent-backend/.vscode/settings.json	vendored	Normal file
@@ -0,0 +1,3 @@
{
  "java.configuration.updateBuildConfiguration": "interactive"
}
145	agent-backend/CLAUDE.md	Normal file
@@ -0,0 +1,145 @@
# CLAUDE.md - Dateieingang Service

## Project Description

Quarkus backend service that replaces an n8n workflow at Galabau.
It receives an HTTP trigger from the APEX Automation, fetches ZIP files from an SFTP server,
extracts them, uploads the files to OCI Object Storage, and notifies the Oracle database
via ORDS.

## Documentation

Read only when needed - do not load into context automatically:

| Document | Content |
|---|---|
| `SETUP.md` | Installation (WSL, Java 25, Quarkus CLI), build, dev server, environment variables |
| `docs/Architecture.md` | Pipeline steps, service mapping, configuration structure |
| `docs/Logging.md` | Logging stack (LGTM), MDC fields, log events per step |
| `docs/Plan.md` | Complete architecture and migration plan |
| `docs/SFTP-Integration.md` | SSHJ architecture, host key verification, error handling |

Database-specific documents: `../database/`

## Tech Stack

| Component | Technology |
|---|---|
| Framework | Quarkus (Java 25) |
| Build | Maven (Maven Wrapper `./mvnw`) |
| SFTP | SSHJ (`com.hierynomus:sshj`) |
| ZIP processing | Apache Commons Compress |
| OCI Object Storage | OCI Java SDK (`oci-java-sdk-objectstorage`) |
| ORDS client | MicroProfile REST Client |
| Fault tolerance | SmallRye Fault Tolerance (`@Retry`, `@Timeout`) |
| Async | Quarkus Managed Executor (`quarkus-smallrye-context-propagation`) |
| Logging | Quarkus + OTLP → Loki/Grafana (dev: `quarkus-observability-devservices-lgtm`) |
| JSON | Jackson |

## Package Structure

```
src/main/java/de/galabau/dateieingang/
├── api/        # REST endpoint (inbound from the APEX Automation)
├── sftp/       # SSHJ: list, download, rename
├── zip/        # ZIP extraction (Apache Commons Compress)
├── oci/        # OCI Object Storage upload + marker
├── ords/       # Oracle ORDS REST client (process_incoming)
├── pipeline/   # Orchestration of the overall workflow
├── model/      # ProcessingContext, FileEntry, DTOs
├── config/     # @ConfigMapping interfaces
└── exception/  # Domain exceptions
```

## External Dependencies

| Service | Configuration | Purpose |
|---|---|---|
| SFTP server | `galabau.sftp.*` | Fetch ZIP files |
| OCI Object Storage | `galabau.oci.*` | Store files + marker |
| Oracle ORDS | `galabau.ords.*` | Trigger DB processing |
| APEX Automation | calls `/api/process-incoming` | Trigger (hourly) |

## Conventions

- **Language**: documentation in German, code in English
- **Configuration**: all URLs, ports, and usernames in `application.properties`; passwords and keys via `${ENV_VAR}` from environment variables
- **No in-service scheduling**: the trigger comes from the APEX Automation via HTTP POST
- **Async**: the pipeline runs in the background; APEX immediately receives 202 Accepted
- **Cleanup**: always delete local work files in a `finally` block
- **Single instance**: no clustering - an `AtomicBoolean` guard prevents concurrent runs
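The last two conventions (immediate 202 plus the single-instance guard) work together as one small pattern. A minimal plain-Java sketch - the class and method names are illustrative assumptions, and the real service hands the work to a Quarkus Managed Executor rather than a bare `ExecutorService`:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative sketch only: shows the compare-and-set guard plus background
// execution. The real resource returns its status codes via Quarkus REST.
public class PipelineGuard {
    private final AtomicBoolean isRunning = new AtomicBoolean(false);
    // Daemon thread so this sketch does not block JVM shutdown.
    private final ExecutorService executor = Executors.newSingleThreadExecutor(r -> {
        Thread t = new Thread(r, "pipeline");
        t.setDaemon(true);
        return t;
    });

    /** Returns 202 if the pipeline was started, 409 if a run is already active. */
    public int trigger(Runnable pipeline) {
        if (!isRunning.compareAndSet(false, true)) {
            return 409; // a run is already in progress
        }
        executor.submit(() -> {
            try {
                pipeline.run();
            } finally {
                isRunning.set(false); // always release the guard
            }
        });
        return 202; // accepted, pipeline continues in the background
    }
}
```

`compareAndSet` makes check-and-start atomic, so two overlapping triggers can never both start a run.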
## Java Code Guidelines

### Language

Documentation and comments in German, code (class, method, and variable names) in English.

### Javadoc

Write Javadoc for all public methods of the service classes and for all classes in `model/`.
Not needed for trivial getters, test methods, or self-explanatory signatures.

```java
// ✅ Javadoc worthwhile
/**
 * Uploads all files from the ProcessingContext to OCI Object Storage
 * and then writes the marker for DB processing.
 *
 * @param context contains the list of files to upload and the target prefix
 * @throws OciException on persistent OCI errors (4xx)
 */
public void upload(ProcessingContext context) throws OciException { ... }
```

### Exceptions

Dedicated domain exceptions per integration area. Avoid catching `Exception` or `RuntimeException`.

```java
// ✅
throw new SftpException("SFTP rename failed for: " + filename, cause);

// ❌
throw new RuntimeException("something went wrong");
```

### Logging

Use `io.quarkus.logging.Log` - a static API with no logger-field boilerplate.

```java
import io.quarkus.logging.Log;

@ApplicationScoped
public class SftpService {
    public List<String> listZipFiles() throws SftpException {
        Log.infof("Listing ZIP files on SFTP: %s", config.remotePath());
        Log.errorf(e, "SFTP list failed on %s", config.host());
    }
}
```

Always log with MDC context (attached automatically, see `docs/Logging.md`).
Never log sensitive data (passwords, keys, key file paths).

### Configuration

No hardcoded values - everything via `@ConfigMapping`.

```java
// ✅
@ConfigMapping(prefix = "galabau.sftp")
public interface SftpConfig { ... }

// ❌
private static final String HOST = "sftp.lieferant.de";
```

### CDI / Services

Services are `@ApplicationScoped`. No manual instantiation with `new`.
### Package Boundaries

No direct access to internal classes of other packages - use the public service API only.
209	agent-backend/SETUP.md	Normal file
@@ -0,0 +1,209 @@
# Setup - Dateieingang Service

All commands are run in **WSL**.
Maven is invoked through the bundled Maven Wrapper (`./mvnw`) - no separate Maven installation needed.

---

## Prerequisites

### Java 25 (via SDKMAN)

```bash
curl -s "https://get.sdkman.io" | bash
source "$HOME/.sdkman/bin/sdkman-init.sh"
sdk install java 25-tem
sdk default java 25-tem
java -version
```

### Maven Wrapper

The project uses the Maven Wrapper (`./mvnw`), which is already included in the repository.
No separate Maven installation is necessary.

The wrapper was set up initially by (1) installing Maven manually and (2) running the `wrapper` Maven goal:

1. `sdk install maven`
2. `mvn -N wrapper:wrapper -Dmaven=3.9.14`

> **Note:** SDKMAN pins the Java version, `mvnw` pins the Maven version -
> together they ensure that all developers produce identical builds.

---

## Building the Project

```bash
# Build (runs unit tests)
./mvnw package

# Build without tests
./mvnw package -DskipTests

# Build + run all tests (including integration tests)
./mvnw verify
```

Build artifact: `target/quarkus-app/`

---

## Starting the Dev Server

### Default (without the observability stack)

```bash
./mvnw quarkus:dev
```

Quarkus Dev UI: http://localhost:8080/q/dev/
REST endpoint: `POST http://localhost:8080/api/process-incoming`

### With the Grafana / LGTM stack

```bash
./mvnw quarkus:dev -Pgrafana
```

Starts automatically via Dev Services:
- **Loki** - log aggregation
- **Grafana** - http://localhost:3000 (admin/admin)
- **Tempo** - distributed tracing
- **Mimir** - metrics

Logs are sent directly to Loki via OTLP - no Promtail, no manual setup.

> **Note:** The `-Pgrafana` profile is intended for development only.
> In production, an external OTLP collector (e.g. Grafana Alloy) is used.

---
## Environment Variables

For local development, create a `.env` file in the project directory:

```bash
# API protection of the REST endpoint
export GALABAU_API_KEY=<local-dev-key>

# SFTP
export GALABAU_SFTP_PASSWORD=<sftp-password>
# Alternatively (public key auth):
# export SFTP_KEY_PASSPHRASE=<passphrase>

# OCI Object Storage credentials
export OCI_TENANCY_ID=ocid1.tenancy.oc1..xxx
export OCI_USER_ID=ocid1.user.oc1..xxx
export OCI_FINGERPRINT=aa:bb:cc:dd:...
# Locally: path to your own OCI key file
export OCI_PRIVATE_KEY_PATH=~/.oci/oci_api_key.pem
# In production (Kubernetes): mounted secret, e.g. /etc/oci/private-key.pem

# ORDS
export GALABAU_ORDS_API_KEY=<ords-api-key>
```

Import it:

```bash
source .env
```

> **Never commit .env** - add it to `.gitignore`.

---

## Project Structure

```
src/main/java/de/galabau/dateieingang/
├── api/        # FileProcessingResource - REST endpoint (/api/process-incoming)
├── sftp/       # SftpService, SftpConfig - SSHJ: list, download, rename
├── zip/        # ZipExtractionService - Apache Commons Compress
├── oci/        # OciUploadService, OciConfig - OCI SDK, marker handling
├── ords/       # OrdsClient, OrdsNotificationService - MicroProfile REST Client
├── pipeline/   # FileProcessingPipeline - orchestration + async
├── model/      # ProcessingContext, FileEntry, OrdsRequest
├── config/     # ApplicationConfig
└── exception/  # SftpException, ZipException, OciException, OrdsException
```

---
## Calling the Endpoint Manually (Development)

```bash
curl -X POST http://localhost:8080/api/process-incoming \
  -H "X-Api-Key: $GALABAU_API_KEY"
```

Expected response: `HTTP 202 Accepted` (the pipeline runs in the background)

---

## Useful Maven Commands

```bash
# Add an extension
./mvnw quarkus:add-extension -Dextensions="smallrye-fault-tolerance"

# List installed extensions
./mvnw quarkus:list-extensions

# Show the dependency tree
./mvnw dependency:tree
```

---

## Testing

```bash
# Unit tests
./mvnw test

# Integration tests (requires running external services / Docker for Testcontainers)
./mvnw verify
```

SFTP integration tests use Testcontainers with `atmoz/sftp` -
Docker must be running locally.

---

## Debugging

### With IntelliJ IDEA

1. Open the project
2. Select Java 25 as the project SDK
3. Install the Quarkus plugin (if not already installed)
4. Start the dev server via `quarkus:dev` - Quarkus supports hot reload

### With VS Code

1. Install the Quarkus plugin for VS Code
2. Open the project directory
3. Use the built-in debug functionality

---

## Deployment (Kubernetes / OCI)

```bash
# Build the JAR
./mvnw package -DskipTests

# Build the container image (Quarkus JVM mode)
./mvnw package -Dquarkus.container-image.build=true
```

Create Kubernetes Secrets for the credentials:
- Scalar values (`OCI_TENANCY_ID`, `OCI_USER_ID`, `OCI_FINGERPRINT`, `GALABAU_API_KEY`, etc.)
  → Kubernetes Secret → env vars
- OCI private key (PEM file)
  → Kubernetes Secret → volume mount at `/etc/oci/private-key.pem`

See `docs/Plan.md`, section "Deployment-Constraints".
166	agent-backend/docs/Architecture.md	Normal file
@@ -0,0 +1,166 @@
# Architecture - Dateieingang Service

## Overview

The service receives an HTTP trigger from the APEX Automation, connects to an
SFTP server, downloads new ZIP files, extracts them, and uploads the contents to OCI Object
Storage. It then calls the ORDS endpoint `pck_auto_import.p_process_incoming_files`,
which kicks off the DB processing. Scheduling stays with APEX - the service
itself has no scheduler of its own.

## Data Flow

```
APEX Automation (hourly)
        │
        ▼ HTTP POST /api/process-incoming (header: X-Api-Key)
FileProcessingResource
        │ 202 Accepted (immediately)
        ▼
FileProcessingPipeline [ManagedExecutor - background thread]
        │
        │ for each *.zip on the SFTP server:
        │
        ├─→ SftpService.listZipFiles()       [SSHJ]
        ├─→ SftpService.download()           [SSHJ]
        │
        ├─→ ZipExtractionService.extract()   [Apache Commons Compress]
        │        └─ ProcessingContext with List<FileEntry>
        │
        ├─→ OciUploadService.upload()        [OCI SDK]
        │        └─ files in eingang/<zip-name>/ + marker
        │
        ├─→ SftpService.renameRemote()       [SSHJ]
        │        └─ .processed (success) or .error (failure)
        │
        ├─→ OrdsNotificationService.notify() [MicroProfile REST Client]
        │        └─ POST pck_auto_import.p_process_incoming_files
        │
        └─→ Cleanup: delete local files      [always, in finally]
        │
        ▼
Oracle DB (pck_auto_import processes eingang/<zip-name>/)
```

## Pipeline Steps

| Step name | Class | Package | Library |
|---|---|---|---|
| `api-trigger` | `FileProcessingResource` | `api` | Quarkus REST |
| `sftp-list` | `SftpService` | `sftp` | SSHJ |
| `sftp-download` | `SftpService` | `sftp` | SSHJ |
| `zip-extract` | `ZipExtractionService` | `zip` | Apache Commons Compress |
| `oci-upload` | `OciUploadService` | `oci` | OCI Java SDK |
| `sftp-rename` | `SftpService` | `sftp` | SSHJ |
| `ords-notify` | `OrdsNotificationService` | `ords` | MicroProfile REST Client |
| `cleanup` | `FileProcessingPipeline` | `pipeline` | pure Java |
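The `zip-extract` step can be sketched in plain Java. The real service uses Apache Commons Compress; `java.util.zip` is used here only to keep the example self-contained, and the path-traversal (zip-slip) check reflects common practice rather than a documented detail of the service:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

// Illustrative sketch of the zip-extract step: extracts a ZIP into targetDir,
// preserving the folder structure, and returns the extracted file paths.
public class ZipExtractSketch {
    public static List<Path> extract(Path zipFile, Path targetDir) throws IOException {
        List<Path> extracted = new ArrayList<>();
        try (InputStream in = Files.newInputStream(zipFile);
             ZipInputStream zip = new ZipInputStream(in)) {
            for (ZipEntry entry; (entry = zip.getNextEntry()) != null; ) {
                // Guard against zip-slip: the resolved path must stay inside targetDir.
                Path target = targetDir.resolve(entry.getName()).normalize();
                if (!target.startsWith(targetDir)) {
                    throw new IOException("Blocked zip-slip entry: " + entry.getName());
                }
                if (entry.isDirectory()) {
                    Files.createDirectories(target);
                } else {
                    Files.createDirectories(target.getParent());
                    Files.copy(zip, target, StandardCopyOption.REPLACE_EXISTING);
                    extracted.add(target);
                }
            }
        }
        return extracted;
    }
}
```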
## Orchestration

`FileProcessingPipeline` is the single orchestrator - no framework routing.
The pipeline runs on a `ManagedExecutor` thread (Quarkus Context Propagation),
so MDC context and CDI scope remain available on the background thread.

```
FileProcessingResource            [REST thread]
        │ 202 immediately
        ▼
FileProcessingPipeline            [ManagedExecutor thread]
        │ AtomicBoolean prevents concurrent runs
        │
        └─ processAll()
             └─ for each ZIP: try { ... } finally { cleanup(); MDC.clear(); }
```

The `AtomicBoolean isRunning` guard ensures that a second APEX trigger arriving during
an active run is rejected with `409 Conflict` instead of starting a second run.

## Configuration (`application.properties`)

```properties
# API protection
galabau.api.key=${GALABAU_API_KEY}

# SFTP
galabau.sftp.host=sftp.lieferant.de
galabau.sftp.port=22
galabau.sftp.username=sftp_user
galabau.sftp.password=${GALABAU_SFTP_PASSWORD}
galabau.sftp.host-key-fingerprint=SHA256:...   # ssh-keyscan host | ssh-keygen -lf -
galabau.sftp.remote-path=/outgoing
galabau.sftp.local-work-dir=/tmp/sftp-work
# galabau.sftp.private-key-path=/etc/secrets/sftp-key
# galabau.sftp.private-key-passphrase=${SFTP_KEY_PASSPHRASE}

# OCI Object Storage
galabau.oci.namespace=mycompany
galabau.oci.region=eu-frankfurt-1
galabau.oci.bucket=my-bucket
galabau.oci.tenant-prefix=mandant_42/
galabau.oci.incoming-prefix=eingang/
galabau.oci.tenancy-id=${OCI_TENANCY_ID}
galabau.oci.user-id=${OCI_USER_ID}
galabau.oci.fingerprint=${OCI_FINGERPRINT}
galabau.oci.private-key-path=${OCI_PRIVATE_KEY_PATH}

# ORDS
galabau.ords.base-url=http://ords:8080
galabau.ords.process-incoming-path=/ords/.../auto_import/process_incoming
galabau.ords.api-key=${GALABAU_ORDS_API_KEY}
quarkus.rest-client.ords-client.url=${galabau.ords.base-url}

# Retry
galabau.retry.max-attempts=3
galabau.retry.backoff-ms=1000
```

## Error Handling

| Error | Type | Handling |
|---|---|---|
| SFTP connection failed | transient | Pipeline aborts; the next APEX run (1h) retries |
| Corrupt ZIP | persistent | SFTP rename to `.error`, log, continue with the next ZIP |
| OCI 503 | transient | `@Retry` (3x with backoff) |
| OCI 403 auth failed | persistent | SFTP rename to `.error`, log - check credentials! |
| ORDS call failed | transient | Marker is present; APEX picks it up on the next run |
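The transient rows above all follow the same 3-attempts-with-backoff policy (`galabau.retry.max-attempts=3`, `galabau.retry.backoff-ms=1000`). In the service this is declarative via SmallRye's `@Retry`; the following plain-Java loop, with hypothetical names, only illustrates the equivalent behavior:

```java
import java.util.concurrent.Callable;

// Plain-Java illustration of the configured retry policy. The real service
// does NOT contain this class; SmallRye Fault Tolerance handles it declaratively.
public class RetrySketch {
    public static <T> T withRetry(Callable<T> action, int maxAttempts, long backoffMs)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e; // transient failure: wait, then retry
                if (attempt < maxAttempts) Thread.sleep(backoffMs);
            }
        }
        throw last; // all attempts exhausted: treated as a persistent failure
    }
}
```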
## OCI Authentication

`SimpleAuthenticationDetailsProvider` with credentials from environment variables (scalar values)
and a mounted PEM file (private key via a Kubernetes Secret volume).

```java
SimpleAuthenticationDetailsProvider auth =
    SimpleAuthenticationDetailsProvider.builder()
        .tenantId(config.tenancyId())
        .userId(config.userId())
        .fingerprint(config.fingerprint())
        .region(Region.fromRegionId(config.region()))
        .privateKeySupplier(new FilePrivateKeySupplier(config.privateKeyPath()))
        .build();
```

## Data Model

### `ProcessingContext`

| Field | Type | Description |
|---|---|---|
| `runId` | `UUID` | Unique ID of this processing run (MDC context) |
| `zipFilename` | `String` | e.g. `export_2026-04-08.zip` |
| `zipNameWithoutExt` | `String` | e.g. `export_2026-04-08` |
| `localZipPath` | `Path` | Local ZIP file (for cleanup) |
| `localExtractDir` | `Path` | Local extraction directory (for cleanup) |
| `extractedFiles` | `List<FileEntry>` | List of the extracted files |
| `markerUploaded` | `boolean` | Marker uploaded to OCI? |
| `startTime` | `LocalDateTime` | Start timestamp |
| `status` | `ProcessingStatus` | `PENDING / PARTIALLY_UPLOADED / MARKER_UPLOADED / ORDS_NOTIFIED / FAILED` |

### `FileEntry`

| Field | Type | Description |
|---|---|---|
| `relativePath` | `String` | e.g. `subdir/file.csv` |
| `ociKey` | `String` | e.g. `eingang/export_2026-04-08/subdir/file.csv` |
| `fileSize` | `long` | File size in bytes |
| `isMarker` | `boolean` | `true` only for `_READY_FOR_DB_PROCESSING_` |
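As a sketch, the `FileEntry` table could map onto a Java record like the following. The record form and the `of(...)` factory are illustrative assumptions, not the actual class:

```java
// Sketch of the FileEntry model from the table above. The key derivation
// mirrors the documented example: incoming-prefix + zip name + relative path.
public record FileEntry(String relativePath, String ociKey, long fileSize, boolean isMarker) {

    /** Derives the OCI object key from the incoming prefix and the ZIP name. */
    static FileEntry of(String incomingPrefix, String zipNameWithoutExt,
                        String relativePath, long fileSize) {
        String key = incomingPrefix + zipNameWithoutExt + "/" + relativePath;
        return new FileEntry(relativePath, key, fileSize, false);
    }
}
```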
126	agent-backend/docs/Logging.md	Normal file
@@ -0,0 +1,126 @@
# Logging - Dateieingang Service

## Stack

### Development (Dev Services)

Quarkus starts the entire observability stack automatically via Dev Services.

**Maven profile** (`pom.xml`):
```xml
<profile>
    <id>grafana</id>
    <dependencies>
        <dependency>
            <groupId>io.quarkus</groupId>
            <artifactId>quarkus-observability-devservices-lgtm</artifactId>
            <scope>provided</scope>
        </dependency>
    </dependencies>
</profile>
```

With `./mvnw quarkus:dev -Pgrafana` the following start automatically:
- **Loki** - log aggregation
- **Grafana** - visualization / dashboards (http://localhost:3000, admin/admin)
- **Tempo** - distributed tracing (optional)
- **Mimir** - metrics (optional)

**Runtime dependency** (always active):
```xml
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-opentelemetry</artifactId>
</dependency>
```

Quarkus sends logs via **OTLP** directly to the LGTM stack - no Promtail, no log file scraping.

### Production

Run an OTLP-capable collector in front of the production Loki/Grafana stack:
- **Grafana Alloy** (successor to Promtail, OTLP-native)
- **OpenTelemetry Collector**

The Quarkus configuration stays identical - only the OTLP endpoint URL changes.

---

## Log Hierarchy (MDC Fields)

One ZIP processing run produces logs on two levels:

```
runId   → one ZIP processing run (set at the start of each ZIP)
step    → current pipeline step (set at the start of each step)
```

`zipFilename` is additionally logged as human-readable context but is not an MDC field
(not a filter criterion, since runId is more precise).

### MDC Fields in Detail

| Field | Type | Example value | Set in | Cleared in |
|---|---|---|---|---|
| `runId` | UUID | `a1b2c3d4-...` | start of ZIP processing | `finally` after the ZIP |
| `step` | String | `oci-upload` | start of each pipeline step | end of each step |

`MDC.clear()` is **always called in the `finally` block** after each ZIP -
no context leaks into the next ZIP.
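The put/clear discipline above can be sketched without the logging stack. Here a `ThreadLocal` map stands in for the real MDC implementation (an assumption for the sake of a self-contained example) to show the same clear-in-`finally` pattern:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

// Minimal stand-in for MDC, showing the put / clear-in-finally discipline.
// The real service uses the MDC of its logging backend, not this class.
public class MdcSketch {
    private static final ThreadLocal<Map<String, String>> CTX =
            ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) { CTX.get().put(key, value); }
    public static String get(String key) { return CTX.get().get(key); }
    public static void clear() { CTX.get().clear(); }

    /** Processes one ZIP with runId/step context, clearing MDC in finally. */
    public static void processZip(String zipName, Runnable step) {
        put("runId", UUID.randomUUID().toString()); // one runId per ZIP run
        try {
            put("step", "zip-extract");
            step.run();
        } finally {
            clear(); // no context leaks into the next ZIP
        }
    }
}
```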
### Filtering in Grafana (LogQL)

```logql
# All logs of one processing run
{service="dateieingang"} | json | runId="a1b2c3d4"

# All logs of one pipeline step across all runs
{service="dateieingang"} | json | step="oci-upload"

# Errors only
{service="dateieingang"} | json | level="ERROR"

# Errors in a specific run
{service="dateieingang"} | json | runId="a1b2c3d4" | level="ERROR"
```

**Loki labels** (few, static values - no high cardinality):
- `service` = `dateieingang`
- `level` = `INFO` / `WARN` / `ERROR`
- `environment` = `dev` / `prod`

All other fields (`runId`, `step`) are filtered in the query body via `| json`, not as labels.

---

## Log Events per Step

| Step | Level | Event / content |
|---|---|---|
| `api-trigger` | INFO | "Processing triggered by APEX, starting async pipeline" |
| `api-trigger` | WARN | "Pipeline already running, rejecting trigger (409)" |
| `sftp-list` | INFO | "`N` new ZIP files found on SFTP" |
| `sftp-list` | INFO | "No new ZIP files found on SFTP" |
| `sftp-download` | INFO | "ZIP `<filename>` downloaded (`X` bytes)" |
| `zip-extract` | INFO | "ZIP `<filename>` extracted: `N` files, `M` directories" |
| `zip-extract` | ERROR | "ZIP `<filename>` is corrupt or unreadable" |
| `oci-upload` | INFO | "Uploaded `N` files + marker to OCI in `X`ms" |
| `oci-upload` | WARN | "OCI upload attempt `N`/3 failed (503), retrying..." |
| `oci-upload` | ERROR | "OCI upload permanently failed (403) — check credentials" |
| `sftp-rename` | INFO | "SFTP rename: `<file>` → `<file>.processed`" |
| `sftp-rename` | INFO | "SFTP rename: `<file>` → `<file>.error`" |
| `ords-notify` | INFO | "ORDS endpoint called, HTTP 200" |
| `ords-notify` | WARN | "ORDS call failed (attempt `N`/3), retrying..." |
| `ords-notify` | WARN | "ORDS notification failed — marker is present, APEX will pick up next run" |
| `cleanup` | DEBUG | "Local files deleted: `<path>`" |
| all | ERROR | Error with full MDC context and exception message (never log passwords/keys) |

---

## Error Behavior (Logging Perspective)

- Errors are **always logged before** the rename to `.error`, so the log entry
  exists even if the rename itself fails
- The `runId` context is kept for the entire ZIP processing run -
  all logs of one ZIP are filterable in Grafana via `runId`
- Retry behavior: see `Architecture.md`, section "Error Handling"
595	agent-backend/docs/Plan.md	Normal file
@@ -0,0 +1,595 @@
|
|||||||
|
# Plan: n8n Workflow → Quarkus Backend ersetzen
|
||||||
|
|
||||||
|
**Stand:** 2026-04-08
|
||||||
|
**Basis:** Avise-Tool Quarkus Backend (agent-backend/) — angepasst für SFTP → OCI Object Storage → DB-Workflow
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Ziel
|
||||||
|
|
||||||
|
Ersetze den n8n-Workflow (derzeit: SFTP-Poll → ZIP-Entpacken → OCI-Upload → ORDS-Trigger) durch einen nativen Quarkus-Backend-Service.
|
||||||
|
|
||||||
|
**Vorteil:**
|
||||||
|
- Keine externe Workflow-Engine-Abhängigkeit (n8n)
|
||||||
|
- Single-Responsibility: Ein Service für eine Aufgabe (Dateieingang)
|
||||||
|
- Bessere Observability (OTLP → Loki/Grafana)
|
||||||
|
- Volle Kontrolle über Fehlerbehandlung und Retry-Logik
|
||||||
|
- Einfacheres Deployment (JAR statt n8n-Instanz)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Architektur-Überblick
|
||||||
|
|
||||||
|
Das Scheduling übernimmt weiterhin die APEX Automation (stündlich). Sie ruft den Quarkus-Service
|
||||||
|
per HTTP POST auf — wie bisher den n8n-Webhook. Der Service selbst hat keinen eigenen Scheduler.
|
||||||
|
|
||||||
|
```
|
||||||
|
APEX Automation (stündlich)
|
||||||
|
│
|
||||||
|
▼ HTTP POST /api/process-incoming (Header: X-Api-Key)
|
||||||
|
FileProcessingResource [Quarkus REST Endpoint]
|
||||||
|
│ gibt sofort 202 Accepted zurück (async, fire & forget)
|
||||||
|
▼
|
||||||
|
FileProcessingPipeline [Hintergrund-Thread — für jede neue ZIP]
|
||||||
|
│
|
||||||
|
├─→ SftpService [SSHJ — list, download, rename]
|
||||||
|
│
|
||||||
|
├─→ ZipExtractionService [Apache Commons Compress]
|
||||||
|
│
|
||||||
|
├─→ OciUploadService [OCI SDK — Instance Principal Auth]
|
||||||
|
│
|
||||||
|
└─→ OrdsNotificationService [MicroProfile REST Client]
|
||||||
|
│
|
||||||
|
▼
|
||||||
|
Oracle DB (eingang/, zielordner)
|
||||||
|
```
|
||||||
|
|
||||||
|
---

## Project Structure (Maven)

```
backend-n8n-replacement/
├── pom.xml (Java 25, Quarkus 3.x LTS)
│
├── src/main/java/de/galabau/
│   │
│   ├── api/
│   │   └── FileProcessingResource.java (@Path("/api/process-incoming"), @POST, API key check)
│   │
│   ├── sftp/
│   │   ├── SftpConfig.java (@ConfigMapping)
│   │   └── SftpService.java (SSHJ — list, download, rename)
│   │
│   ├── zip/
│   │   └── ZipExtractionService.java (Apache Commons Compress)
│   │
│   ├── oci/
│   │   ├── OciConfig.java
│   │   └── OciUploadService.java (OCI SDK, SimpleAuthenticationDetailsProvider, marker handling)
│   │
│   ├── ords/
│   │   ├── OrdsClient.java (MicroProfile REST Client)
│   │   └── OrdsNotificationService.java (retry via @Retry annotation)
│   │
│   ├── pipeline/
│   │   └── FileProcessingPipeline.java (Extract → Upload → Notify → Cleanup)
│   │
│   ├── model/
│   │   ├── ProcessingContext.java (runId, files, status)
│   │   └── OrdsRequest.java (DTO)
│   │
│   ├── config/
│   │   └── ApplicationConfig.java (central @ConfigMapping)
│   │
│   └── exception/
│       ├── SftpException.java
│       ├── ZipException.java
│       ├── OciException.java
│       └── OrdsException.java
│
├── src/main/resources/
│   ├── application.properties
│   └── application-dev.properties
│
└── docs/
    ├── SETUP.md
    ├── Architecture.md
    ├── Configuration.md
    └── ErrorHandling.md
```

---

## Pipeline Steps

| Step | Component | Library | Description |
|---|---|---|---|
| `api-trigger` | `FileProcessingResource` | Quarkus REST | Receives the POST from the APEX Automation, checks the API key, starts the pipeline asynchronously |
| `sftp-list` | `SftpService` | SSHJ | Lists `*.zip` files in the remote directory |
| `sftp-download` | `SftpService` | SSHJ | Downloads the ZIP into the local working directory |
| `zip-extract` | `ZipExtractionService` | Apache Commons Compress | Extracts the ZIP, preserving the directory structure |
| `oci-upload` | `OciUploadService` | OCI SDK | Uploads the files plus the marker to OCI Object Storage |
| `sftp-rename` | `SftpService` | SSHJ | Remote rename to `.processed` or `.error` |
| `ords-notify` | `OrdsNotificationService` | MicroProfile REST Client | Calls the ORDS endpoint |
| `cleanup` | `FileProcessingPipeline` | plain Java | Deletes local working files (ZIP + extracted files) |

---

## Configuration (`application.properties`)

```properties
# API protection
galabau.api.key=${GALABAU_API_KEY}

# SFTP
galabau.sftp.host=sftp.lieferant.de
galabau.sftp.port=22
galabau.sftp.username=sftp_user
galabau.sftp.password=${GALABAU_SFTP_PASSWORD}
galabau.sftp.host-key-fingerprint=SHA256:AbCdEfGh... # ssh-keyscan host | ssh-keygen -lf -
galabau.sftp.remote-path=/outgoing
galabau.sftp.local-work-dir=/tmp/sftp-work

# Alternative: public-key auth (recommended for production)
# galabau.sftp.private-key-path=/etc/secrets/sftp-key
# galabau.sftp.private-key-passphrase=${SFTP_KEY_PASSPHRASE}

# OCI Object Storage — auth via credentials (key mounted as a Kubernetes Secret)
galabau.oci.namespace=mycompany
galabau.oci.region=eu-frankfurt-1
galabau.oci.bucket=my-bucket
galabau.oci.tenant-prefix=mandant_42/
galabau.oci.incoming-prefix=eingang/
# Scalar credentials from env vars:
galabau.oci.tenancy-id=${OCI_TENANCY_ID}
galabau.oci.user-id=${OCI_USER_ID}
galabau.oci.fingerprint=${OCI_FINGERPRINT}
# Private key as a mounted Kubernetes Secret (file path, not an env-var string):
galabau.oci.private-key-path=${OCI_PRIVATE_KEY_PATH} # e.g. /etc/oci/private-key.pem

# ORDS
galabau.ords.base-url=http://ords:8080
galabau.ords.process-incoming-path=/ords/.../net_storage/process_incoming
galabau.ords.api-key=${GALABAU_ORDS_API_KEY}

# Error handling
galabau.retry.max-attempts=3
galabau.retry.backoff-ms=1000
```

**Credential strategy:**
- All passwords and API keys via `${ENV_VAR_NAME}` — Quarkus reads environment variables natively
- OCI auth via **SimpleAuthenticationDetailsProvider** — scalar credentials from env vars, private key as a mounted Kubernetes Secret
- Deployment: Kubernetes Secrets → environment injection for passwords/API keys

---

## REST Endpoint

```java
@Path("/api/process-incoming")
@ApplicationScoped
public class FileProcessingResource {

    @Inject
    ApplicationConfig config;

    @Inject
    FileProcessingPipeline pipeline;

    @POST
    @Produces(MediaType.APPLICATION_JSON)
    public Response triggerProcessing(@HeaderParam("X-Api-Key") String apiKey) {
        if (!config.api().key().equals(apiKey)) {
            return Response.status(Response.Status.UNAUTHORIZED).build();
        }
        // Async: pipeline runs in the background; APEX gets the response immediately
        // (202 Accepted, or 409 Conflict if a run is already in progress)
        return pipeline.processAllAsync();
    }
}
```

**Invocation from the APEX Automation:**
```sql
apex_web_service.make_rest_request(
    p_url          => 'http://backend:8080/api/process-incoming',
    p_http_method  => 'POST',
    p_http_headers => apex_util.string_to_table('X-Api-Key:' || <secret>)
);
```
|
||||||
|
|
||||||
|
**Concurrency Guard:** `FileProcessingPipeline` hält ein `AtomicBoolean isRunning`-Flag.
|
||||||
|
Kommt ein zweiter Trigger während eine Pipeline läuft → 409 Conflict zurückgeben, kein doppelter Lauf.
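
Stripped of JAX-RS, the guard is a single `compareAndSet`. A minimal stand-alone sketch of the semantics (an illustration, not the production class):

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class RunGuard {
    private final AtomicBoolean isRunning = new AtomicBoolean(false);

    /** true → this caller owns the run; false → the caller answers 409 Conflict. */
    public boolean tryStart() {
        return isRunning.compareAndSet(false, true);
    }

    /** Called in finally after the pipeline run finishes, success or failure. */
    public void finish() {
        isRunning.set(false);
    }
}
```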

---

## Async Behavior

```java
@ApplicationScoped
public class FileProcessingPipeline {

    private final AtomicBoolean isRunning = new AtomicBoolean(false);

    @Inject
    ManagedExecutor executor; // Quarkus managed executor

    public Response processAllAsync() {
        if (!isRunning.compareAndSet(false, true)) {
            return Response.status(409).entity("Pipeline already running").build();
        }
        executor.submit(() -> {
            try {
                processAll();
            } finally {
                isRunning.set(false);
            }
        });
        return Response.accepted().build();
    }

    void processAll() { ... } // actual logic
}
```

Errors in the background thread end up in the log (OTLP → Grafana) — APEX does not
see them as HTTP errors, since the response has already been sent. This matches the
previous n8n fire-and-forget behavior.

---

## Error Handling & Retry

### Error Classes

| Error | Type | Retry | Behavior |
|---|---|---|---|
| SFTP connection failed | transient | no | Next APEX run (1h) tries again |
| Corrupted ZIP | persistent | no | Rename ZIP on SFTP to `.error`, log |
| OCI connection failed (e.g. 503) | transient | yes (exponential backoff) | @Retry |
| OCI upload of a single file fails | persistent | no | Delete already uploaded files, leave ZIP in place, log |
| ORDS call fails | transient | yes (2-3x) | Marker is in place → APEX Automation picks it up on the next run |
| Generic technical error | case-dependent | see SmallRye Fault Tolerance | exception log |

### Retry Strategy (SmallRye Fault Tolerance)

```java
@Retry(maxRetries = 3, delay = 1000, delayUnit = ChronoUnit.MILLIS)
@Timeout(value = 10, unit = ChronoUnit.SECONDS)
public void notifyOrds(ProcessingContext context) { ... }
```

---

## Data Models

### `ProcessingContext`

```java
public class ProcessingContext {
    public UUID runId;                     // unique run ID
    public String zipFilename;             // e.g. "export_2026-04-08.zip"
    public String zipNameWithoutExt;       // e.g. "export_2026-04-08"
    public Path localZipPath;              // local ZIP file (for cleanup)
    public Path localExtractDir;           // local extraction directory (for cleanup)
    public List<FileEntry> extractedFiles; // directory + file names
    public boolean markerUploaded;
    public LocalDateTime startTime;
    public ProcessingStatus status;
}

public enum ProcessingStatus {
    PENDING, PARTIALLY_UPLOADED, MARKER_UPLOADED, ORDS_NOTIFIED, FAILED
}
```

### `FileEntry`

```java
public class FileEntry {
    public String relativePath; // "subdir/file.csv"
    public String ociKey;       // "eingang/export_2026-04-08/subdir/file.csv"
    public long fileSize;
    public boolean isMarker;    // true only for "_READY_FOR_DB_PROCESSING_"
}
```
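
A minimal sketch of how `ociKey` can be derived from the configured `incoming-prefix`, the ZIP base name, and an entry's relative path. Pure string logic; whether the `tenant-prefix` is additionally prepended here or at upload time is an implementation choice this document leaves open:

```java
public class OciKeyMapper {
    /**
     * Builds the Object Storage key for an extracted file, e.g.
     * "eingang/" + "export_2026-04-08" + "/" + "subdir/file.csv".
     * Backslashes from Windows-style ZIP entries are normalized to "/".
     */
    public static String buildKey(String incomingPrefix, String zipNameWithoutExt, String relativePath) {
        String normalized = relativePath.replace('\\', '/');
        return incomingPrefix + zipNameWithoutExt + "/" + normalized;
    }
}
```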

---

## SFTP Processing with SSHJ

See `docs/SFTP-Integration.md` for details on the SSHJ integration (host key verification,
credentials, error handling).

### Processing Logic

```
Pipeline.processAll():
  1. SftpService.listZipFiles() → ["export_2026-04-08.zip", ...]
  2. for each ZIP:
     a. SftpService.download(zip) → local file
     b. ZipExtractionService.extract() → ProcessingContext with FileEntry list
     c. OciUploadService.upload() → files + marker in OCI
     d. SftpService.renameRemote(.processed or .error)
     e. OrdsNotificationService.notify()
     f. cleanup: delete local ZIP + extraction directory ← always, even on failure
```
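
The control flow of steps (a)–(f) boils down to one try/catch/finally per ZIP: the rename reflects the outcome, notify happens only on success, and cleanup always runs. A framework-free sketch of that ordering (the string trace stands in for the real service calls — an illustration, not the production code):

```java
import java.util.ArrayList;
import java.util.List;

public class PipelineFlow {
    /** Processes one ZIP and returns the trace of executed steps. */
    public static List<String> processOne(String zip, boolean failUpload) {
        List<String> trace = new ArrayList<>();
        boolean success = false;
        try {
            trace.add("download:" + zip);
            trace.add("extract:" + zip);
            if (failUpload) {
                throw new RuntimeException("OCI upload failed");
            }
            trace.add("upload:" + zip);
            success = true;
        } catch (RuntimeException e) {
            trace.add("error:" + e.getMessage());
        } finally {
            // rename on the SFTP server reflects the outcome
            trace.add("rename:" + zip + (success ? ".processed" : ".error"));
            // ORDS is notified only after a successful upload
            if (success) {
                trace.add("notify:" + zip);
            }
            // cleanup always runs, success or failure
            trace.add("cleanup:" + zip);
        }
        return trace;
    }
}
```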

**Cleanup (step f) always runs** — in a `finally` block — so the disk cannot fill up
on errors or large ZIPs.

---

## OCI Authentication (SimpleAuthenticationDetailsProvider)

Credentials come from environment variables (scalar values) and a mounted
Kubernetes Secret (private key as a PEM file).

```java
@ApplicationScoped
public class OciUploadService {

    private ObjectStorage client; // not final: initialized in @PostConstruct

    @Inject
    OciConfig config;

    @PostConstruct
    void init() {
        SimpleAuthenticationDetailsProvider auth =
            SimpleAuthenticationDetailsProvider.builder()
                .tenantId(config.tenancyId())
                .userId(config.userId())
                .fingerprint(config.fingerprint())
                .region(Region.fromRegionId(config.region()))
                .privateKeySupplier(new SimplePrivateKeySupplier(config.privateKeyPath()))
                .build();

        this.client = ObjectStorageClient.builder().build(auth);
    }
}
```

**Kubernetes deployment:**
- Scalar values (`tenancyId`, `userId`, `fingerprint`) → Kubernetes Secret → env vars
- Private key PEM file → Kubernetes Secret → volume mount at `/etc/oci/private-key.pem`
- `OCI_PRIVATE_KEY_PATH=/etc/oci/private-key.pem` as an env var
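
One ordering detail of the marker handling worth pinning down: the `_READY_FOR_DB_PROCESSING_` marker goes up after the data files (cf. `PARTIALLY_UPLOADED` → `MARKER_UPLOADED`), because the DB side treats its presence as the completion signal. A minimal, SDK-independent sketch of that ordering rule:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class UploadOrder {
    static final String MARKER = "_READY_FOR_DB_PROCESSING_";

    /** Data files first, the marker last — it signals "upload complete" to the DB side. */
    public static List<String> orderForUpload(List<String> relativePaths) {
        List<String> ordered = new ArrayList<>(relativePaths);
        // Stable sort: data files keep their order, the marker sorts to the end
        ordered.sort(Comparator.comparingInt(p -> p.endsWith(MARKER) ? 1 : 0));
        return ordered;
    }
}
```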

---

## ORDS Integration

### REST Client

```java
@RegisterRestClient(configKey = "ords-client")
public interface OrdsClient {

    @POST
    @Path("${galabau.ords.process-incoming-path}")
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    Response processIncoming(ProcessIncomingRequest request);
}

// application.properties:
// quarkus.rest-client.ords-client.url=${galabau.ords.base-url}

public class ProcessIncomingRequest {
    public String zipName;   // "export_2026-04-08"
    public String runId;     // for log correlation
    public LocalDateTime uploadedAt;
}
```

### Error Handling

- If the ORDS call fails, the marker is already in place
- The APEX Automation finds the marker on its next hourly run and processes it
- **No** attempt is made to repair the failure locally

---

## Dependencies (pom.xml)

```xml
<properties>
    <java.version>25</java.version>
    <quarkus.platform.version><!-- check current LTS on code.quarkus.io --></quarkus.platform.version>
</properties>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>io.quarkus.platform</groupId>
            <artifactId>quarkus-bom</artifactId>
            <version>${quarkus.platform.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <!-- Quarkus core -->
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-arc</artifactId>
    </dependency>

    <!-- REST endpoint (inbound from APEX) -->
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-rest</artifactId>
    </dependency>

    <!-- REST client (outbound to ORDS) -->
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-rest-client-jackson</artifactId>
    </dependency>

    <!-- Managed executor (for the async pipeline) -->
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-smallrye-context-propagation</artifactId>
    </dependency>

    <!-- SFTP: SSHJ — modern Java SFTP/SSH client -->
    <!-- Check current version: https://mvnrepository.com/artifact/com.hierynomus/sshj -->
    <dependency>
        <groupId>com.hierynomus</groupId>
        <artifactId>sshj</artifactId>
        <version>0.38.0</version>
    </dependency>

    <!-- OCI Object Storage SDK -->
    <!-- Check current version: https://mvnrepository.com/artifact/com.oracle.oci.sdk/oci-java-sdk-objectstorage -->
    <dependency>
        <groupId>com.oracle.oci.sdk</groupId>
        <artifactId>oci-java-sdk-objectstorage</artifactId>
        <version>3.44.0</version>
    </dependency>

    <!-- ZIP -->
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-compress</artifactId>
        <version>1.26.1</version>
    </dependency>

    <!-- Observability -->
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-opentelemetry</artifactId>
    </dependency>

    <!-- Fault tolerance -->
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-smallrye-fault-tolerance</artifactId>
    </dependency>

    <!-- Tests -->
    <dependency>
        <groupId>io.quarkus</groupId>
        <artifactId>quarkus-junit5</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>
```

**Note:** `oci-java-sdk-objectstorage` and `sshj` are not in the Quarkus BOM —
check their versions on Maven Central at project start. Quarkus artifacts (`quarkus-*`)
need no `<version>`, since the BOM provides it.

---

## Observability & Logging

### MDC Hierarchy

```
runId   → one ZIP processing run (set at the start of each ZIP)
step    → current pipeline step
fileIdx → which file within this step (optional)
```

Clear the MDC after each ZIP with `MDC.clear()` — so no context leaks into the next ZIP.

### Log Levels

| Step | Level | Example |
|---|---|---|
| `api-trigger` | INFO | "Processing triggered by APEX, starting async pipeline" |
| `sftp-list` | INFO | "5 new ZIP files found on SFTP" |
| `zip-extract` | INFO | "ZIP 'export_2026-04-08.zip' extracted: 12 files, 5 directories" |
| `oci-upload` | INFO | "Uploaded 12 files + marker to OCI in 3.2s" |
| `ords-notify` | INFO | "ORDS endpoint called, HTTP 200" |
| `cleanup` | DEBUG | "Local files deleted: /tmp/sftp-work/export_2026-04-08.zip" |
| Errors | ERROR | With full MDC context and exception |

**Stack:** OTLP → Loki/Grafana (development: Dev Services; production: external collector)

---

## Migration Steps

1. **Phase 1: Development**
   - Project setup (Java 25, Quarkus 3.x, SSHJ)
   - REST endpoint (`FileProcessingResource`) with API key auth + concurrency guard
   - SFTP operations (SSHJ: list, download, rename) with fingerprint verification
   - ZIP extraction (Apache Commons Compress)
   - OCI upload (OCI SDK, SimpleAuthenticationDetailsProvider, marker handling)
   - ORDS notification (REST client + retry)
   - Local cleanup (always in finally)
   - Error handling + OTLP logging
   - Implement package `pck_auto_import` with `p_process_incoming_files` on the DB side
   - Unit tests + integration tests

2. **Phase 2: Testing & Deployment**
   - E2E tests against real SFTP / OCI / ORDS
   - Parallel run: n8n and the Quarkus backend active at the same time
   - **Update the APEX Automation:** change the URL from the n8n webhook to `http://backend:8080/api/process-incoming`, add the `X-Api-Key` header
   - Set up Grafana dashboards + alerts

3. **Phase 3: Cutover**
   - Disable / remove n8n
   - Establish monitoring

---

## Deployment Constraints

- **Single instance:** The service runs as a single instance. No clustering.
  Multiple instances would process the same SFTP files concurrently.
  The `AtomicBoolean` guard only protects within one instance.
  As long as the APEX Automation calls the webhook once per run, this is not a problem.

- **Idempotency:** OCI PUT is idempotent. After an aborted run without a marker,
  the next trigger downloads the ZIP again and the files are overwritten — no corrupt state.

- **Disk space:** Local working directory `/tmp/sftp-work` — cleanup always runs in finally.
  Maximum concurrent footprint: one ZIP + its extracted files (sequential processing).
  Capacity planning required (→ open question).

---

## Advantages over n8n

| Aspect | n8n | Quarkus backend |
|---|---|---|
| **Dependencies** | External SaaS/self-hosted | One JAR |
| **Error handling** | Configured via UI | Java code, testable |
| **Observability** | n8n logs + external integrations | OTLP → Loki/Grafana natively |
| **Security** | Credentials in the n8n DB | Env variables / Secrets |
| **Trigger** | n8n webhook | REST endpoint (identical pattern) |
| **Maintainability** | Workflow changes in the UI | Code reviews, Git history |
| **Scaling** | Horizontal scaling is complex | Standard Quarkus/Kubernetes |

---

## Architecture Decisions ✅

- [x] Trigger: **REST endpoint** (called by the APEX Automation, identical to the previous n8n webhook)
- [x] Scheduling: **APEX Automation** (unchanged — no scheduler inside the service)
- [x] Async: **fire & forget** (202 Accepted immediately, pipeline in the background — like n8n)
- [x] Endpoint auth: **X-Api-Key header** (simple, works with APEX web service calls)
- [x] Concurrency guard: **AtomicBoolean** (no duplicate run on fast APEX retries)
- [x] SFTP solution: **SSHJ** (`com.hierynomus:sshj`) — lightweight SFTP client
- [x] SFTP host key: **fingerprint-based** via `galabau.sftp.host-key-fingerprint` in the config
- [x] OCI auth: **SimpleAuthenticationDetailsProvider** (credentials via env vars + mounted Kubernetes Secret for the private key)
- [x] OCI SDK: **OCI Java SDK** (HTTP Signature V1 handled automatically)
- [x] ZIP: **Apache Commons Compress**
- [x] REST clients: **MicroProfile REST Client** (configKey-based)
- [x] Credentials: **`${ENV_VAR}`** in application.properties — Quarkus reads environment variables natively
- [x] Java: **25**
- [x] Cleanup: **always in finally** after each ZIP

## Open Questions

- [ ] Disk space: how large can ZIPs get? → capacity planning
- [ ] SFTP auth: password or public key for production? → public key recommended
- [ ] Monitoring: which Grafana alerts are mandatory? → TBD with the ops team

291 agent-backend/docs/SFTP-Integration.md Normal file
@@ -0,0 +1,291 @@

# SSHJ SFTP — Architecture Overview

**Project:** n8n replacement for incoming-file processing
**Last updated:** 2026-04-08

---

## Why SSHJ instead of Camel Quarkus FTP?

Camel Quarkus FTP is an EIP/routing framework that itself wraps an SFTP library
internally. Since the trigger is a REST endpoint (no own scheduling, no message routing),
Camel would only add overhead:

| | Camel Quarkus FTP | SSHJ |
|---|---|---|
| **Purpose** | EIP framework (routing, EIP patterns, scheduling) | SFTP/SSH client |
| **Size** | Camel core + FTP component (~10+ MB of dependencies) | Lightweight |
| **API** | Camel Exchange, RouteBuilder, endpoints | Simple Java API |
| **Maintainability** | Requires Camel knowledge | Standard Java I/O concepts |
| **Quarkus integration** | Native extension available | Plain CDI bean |
| **Suited for** | Complex routing scenarios | SFTP operations (list, get, rename) |

For this use case (list ZIPs, download, rename after processing), SSHJ is the right choice.

---

## Dependency

```xml
<!-- Check current version: https://mvnrepository.com/artifact/com.hierynomus/sshj -->
<dependency>
    <groupId>com.hierynomus</groupId>
    <artifactId>sshj</artifactId>
    <version>0.38.0</version>
</dependency>
```

SSHJ pulls in Bouncy Castle as a transitive dependency (for cryptography).
That is not a problem — Bouncy Castle is widely used and well maintained.

---

## Pipeline Architecture

```
REST POST /api/process-incoming (X-Api-Key header)
  │
  ▼ 202 Accepted (immediately)
FileProcessingPipeline.processAllAsync()
  │ background thread (ManagedExecutor)
  │
  ├─ SftpService.listZipFiles()
  │    └─ sftp.ls(remotePath) → ["export_2026-04-08.zip", ...]
  │
  └─ for each ZIP:
       │
       ├─ SftpService.download(filename)
       │    └─ sftp.get(remotePath/file, localPath)
       │
       ├─ ZipExtractionService.extract(localFile)
       │    └─ ProcessingContext with List<FileEntry>
       │
       ├─ OciUploadService.upload(context)
       │    └─ files + marker in OCI (SimpleAuthenticationDetailsProvider)
       │
       ├─ SftpService.renameRemote(file, file + ".processed") ← on success
       │  SftpService.renameRemote(file, file + ".error")     ← on failure
       │
       ├─ OrdsNotificationService.notify(context)
       │
       └─ cleanup: delete local ZIP + extraction directory ← always (finally)
```

---

## SftpConfig (@ConfigMapping)

```java
@ConfigMapping(prefix = "galabau.sftp")
public interface SftpConfig {
    String host();
    int port();
    String username();
    String password();
    String hostKeyFingerprint(); // e.g. "SHA256:AbCdEfGh..."
    String remotePath();
    String localWorkDir();
    Optional<String> privateKeyPath();
    Optional<String> privateKeyPassphrase();
}
```

---

## SftpService — Implementation Structure

```java
@ApplicationScoped
public class SftpService {

    @Inject
    SftpConfig config;

    /**
     * Opens an SFTP connection, runs the operation, then disconnects cleanly.
     * The host key is checked against the configured fingerprint — no PromiscuousVerifier.
     */
    private <T> T withSftp(SftpOperation<T> operation) throws SftpException {
        try (SSHClient ssh = new SSHClient()) {
            // Fingerprint-based host key verification (configurable, secure)
            ssh.addHostKeyVerifier(FingerprintVerifier.getInstance(config.hostKeyFingerprint()));
            ssh.connect(config.host(), config.port());

            if (config.privateKeyPath().isPresent()) {
                // Public-key auth (preferred for production)
                ssh.authPublickey(config.username(), config.privateKeyPath().get());
            } else {
                // Password auth (fallback / development)
                ssh.authPassword(config.username(), config.password());
            }

            try (SFTPClient sftp = ssh.newSFTPClient()) {
                return operation.execute(sftp);
            }
        } catch (IOException e) {
            throw new SftpException("SFTP operation failed: " + e.getMessage(), e);
        }
    }

    public List<String> listZipFiles() throws SftpException {
        return withSftp(sftp ->
            sftp.ls(config.remotePath()).stream()
                .filter(entry -> entry.getName().endsWith(".zip"))
                .map(RemoteResourceInfo::getName)
                .toList()
        );
    }

    public Path download(String filename) throws SftpException {
        Path localFile = Path.of(config.localWorkDir(), filename);
        withSftp(sftp -> {
            sftp.get(config.remotePath() + "/" + filename, localFile.toString());
            return null;
        });
        return localFile;
    }

    public void renameRemote(String filename, String newFilename) throws SftpException {
        withSftp(sftp -> {
            sftp.rename(
                config.remotePath() + "/" + filename,
                config.remotePath() + "/" + newFilename
            );
            return null;
        });
    }
}

@FunctionalInterface
interface SftpOperation<T> {
    T execute(SFTPClient sftp) throws IOException;
}
```

---

## Determining the Host Key Fingerprint

Run once (e.g. from WSL or Git Bash) before the service is deployed:

```bash
# Show all host keys of the server (SHA256 fingerprints):
ssh-keyscan sftp.lieferant.de 2>/dev/null | ssh-keygen -lf -

# Example output:
# 256 SHA256:AbCdEfGhIjKlMnOpQrStUv... sftp.lieferant.de (ED25519)
# 2048 SHA256:XyZaBcDeFgHiJk... sftp.lieferant.de (RSA)
```

Put the `SHA256:...` part into `galabau.sftp.host-key-fingerprint`.
**Prefer ED25519** (stronger and shorter than RSA).

---

## Error Handling

### SFTP Connection Errors (transient)

```
IOException during connect/auth/ls
  → SftpException (transient)
  → pipeline aborts
  → next APEX run (1h) tries again
  → all ZIPs remain untouched on the SFTP server
```

### Errors during ZIP Processing

```
ZIP extractable?
  No → rename ZIP to .error, continue with the next ZIP

OCI upload successful?
  No (transient 503)  → @Retry (3x), then ZIP to .error
  No (persistent 403) → ZIP to .error, log — check credentials!

ORDS call failed?
  Marker is in place → no problem, the APEX Automation finds it on the next run
```

### Cleanup (always)

```java
try {
    // ... processing ...
} finally {
    // Always delete local files — regardless of success or failure
    Files.deleteIfExists(context.localZipPath);
    deleteDirectory(context.localExtractDir);
    MDC.clear();
}
```
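
`deleteDirectory` is not a JDK method; a minimal recursive implementation with `java.nio` (a sketch — walking the tree in reverse order so children are deleted before their parents):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

public class FileCleanup {
    /** Recursively deletes a directory tree; a missing directory is not an error. */
    public static void deleteDirectory(Path dir) throws IOException {
        if (!Files.exists(dir)) {
            return;
        }
        try (Stream<Path> walk = Files.walk(dir)) {
            // Reverse order: files and subdirectories first, the root directory last
            walk.sorted(Comparator.reverseOrder())
                .forEach(p -> {
                    try {
                        Files.delete(p);
                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                });
        }
    }
}
```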

---

## Transient vs. Persistent Errors

| Error | Type | Handling |
|---|---|---|
| SFTP connection timeout/refused | transient | next APEX run |
| OCI 503 Service Unavailable | transient | @Retry (3x with backoff) |
| Corrupted ZIP | persistent | `.error`, log |
| OCI 403 Auth Failed | persistent | `.error`, log — check config! |
| SFTP rename failed | transient | log, ZIP stays — next run |

---

## n8n → Quarkus Service: Feature Mapping

| n8n | Quarkus/SSHJ | Note |
|---|---|---|
| SFTP trigger (polling) | dropped — REST endpoint | scheduling stays with the APEX Automation |
| SFTP download | `SftpService.download()` | SSHJ `sftp.get()` |
| Unzip | `ZipExtractionService` | Apache Commons Compress |
| HTTP PUT (OCI) | `OciUploadService` | OCI SDK, SimpleAuthenticationDetailsProvider |
| HTTP POST (ORDS) | `OrdsNotificationService` | MicroProfile REST Client |
| Try/Catch | try/catch + @Retry | no framework overhead |
| Rename | `SftpService.renameRemote()` | SSHJ `sftp.rename()` |
| Error notification | log + OTLP | Loki/Grafana alert |

---

## Maintenance Points

### Code Organization
- **`SftpService`:** SFTP operations only (list, download, rename) — no business logic
- **`FileProcessingPipeline`:** the single orchestrator — knows all services, drives the flow
- **Services:** no dependencies on each other — each testable in isolation
- **Config:** `@ConfigMapping` — all values from `application.properties` / environment variables

### Testing Strategy
- `SftpService`: Testcontainers + a real SFTP container (`atmoz/sftp` on Docker Hub)
- `ZipExtractionService`, `OciUploadService`: unit tests with test data / mocks
- `FileProcessingPipeline`: integration test with mock services
- E2E: against staging SFTP and staging OCI
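
For the `ZipExtractionService` unit tests, ZIP fixtures can be built in memory instead of being checked into the repo. A sketch using the JDK's `java.util.zip` (the production code uses Apache Commons Compress, whose stream API is analogous); it also shows the zip-slip check that extraction should apply:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipFixture {
    /** Builds a ZIP with the given entry names (empty contents) in memory. */
    public static byte[] zipOf(List<String> entryNames) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(bos)) {
            for (String name : entryNames) {
                zos.putNextEntry(new ZipEntry(name));
                zos.closeEntry();
            }
        }
        return bos.toByteArray();
    }

    /** Lists file entries, rejecting zip-slip paths ("../" escaping the target dir). */
    public static List<String> listEntries(byte[] zip) throws IOException {
        List<String> names = new ArrayList<>();
        try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zip))) {
            ZipEntry entry;
            while ((entry = zis.getNextEntry()) != null) {
                if (entry.getName().contains("..")) {
                    throw new IOException("Blocked zip-slip entry: " + entry.getName());
                }
                if (!entry.isDirectory()) {
                    names.add(entry.getName());
                }
            }
        }
        return names;
    }
}
```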
|
||||||
|
|
||||||
|
### Monitoring

- OTLP → Loki (via Quarkus OpenTelemetry)
- Dashboards: ZIP processing rate, error rate, latency per step
- Alerts: `.error` files > threshold, ORDS outages, pipeline runtime > threshold

---
## Configuration (summary)

```properties
# Host key fingerprint (ssh-keyscan host | ssh-keygen -lf -)
galabau.sftp.host-key-fingerprint=SHA256:AbCdEfGh...

# Password from an env var (Quarkus resolves ${...} natively)
galabau.sftp.password=${GALABAU_SFTP_PASSWORD}

# OCI: credentials from env vars, private key as a mounted Kubernetes Secret
# galabau.oci.tenancy-id, user-id, fingerprint from ${ENV_VAR}
# galabau.oci.private-key-path points to the mounted PEM file

# API protection for the endpoint
galabau.api.key=${GALABAU_API_KEY}
```

See `plan_n8n_replacement_quarkus.md` for the complete configuration overview.
295
agent-backend/mvnw
vendored
Normal file
@@ -0,0 +1,295 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------

# ----------------------------------------------------------------------------
# Apache Maven Wrapper startup batch script, version 3.3.4
#
# Optional ENV vars
# -----------------
#   JAVA_HOME - location of a JDK home dir, required when download maven via java source
#   MVNW_REPOURL - repo url base for downloading maven distribution
#   MVNW_USERNAME/MVNW_PASSWORD - user and password for downloading maven
#   MVNW_VERBOSE - true: enable verbose log; debug: trace the mvnw script; others: silence the output
# ----------------------------------------------------------------------------

set -euf
[ "${MVNW_VERBOSE-}" != debug ] || set -x

# OS specific support.
native_path() { printf %s\\n "$1"; }
case "$(uname)" in
CYGWIN* | MINGW*)
  [ -z "${JAVA_HOME-}" ] || JAVA_HOME="$(cygpath --unix "$JAVA_HOME")"
  native_path() { cygpath --path --windows "$1"; }
  ;;
esac

# set JAVACMD and JAVACCMD
set_java_home() {
  # For Cygwin and MinGW, ensure paths are in Unix format before anything is touched
  if [ -n "${JAVA_HOME-}" ]; then
    if [ -x "$JAVA_HOME/jre/sh/java" ]; then
      # IBM's JDK on AIX uses strange locations for the executables
      JAVACMD="$JAVA_HOME/jre/sh/java"
      JAVACCMD="$JAVA_HOME/jre/sh/javac"
    else
      JAVACMD="$JAVA_HOME/bin/java"
      JAVACCMD="$JAVA_HOME/bin/javac"

      if [ ! -x "$JAVACMD" ] || [ ! -x "$JAVACCMD" ]; then
        echo "The JAVA_HOME environment variable is not defined correctly, so mvnw cannot run." >&2
        echo "JAVA_HOME is set to \"$JAVA_HOME\", but \"\$JAVA_HOME/bin/java\" or \"\$JAVA_HOME/bin/javac\" does not exist." >&2
        return 1
      fi
    fi
  else
    JAVACMD="$(
      'set' +e
      'unset' -f command 2>/dev/null
      'command' -v java
    )" || :
    JAVACCMD="$(
      'set' +e
      'unset' -f command 2>/dev/null
      'command' -v javac
    )" || :

    if [ ! -x "${JAVACMD-}" ] || [ ! -x "${JAVACCMD-}" ]; then
      echo "The java/javac command does not exist in PATH nor is JAVA_HOME set, so mvnw cannot run." >&2
      return 1
    fi
  fi
}

# hash string like Java String::hashCode
hash_string() {
  str="${1:-}" h=0
  while [ -n "$str" ]; do
    char="${str%"${str#?}"}"
    h=$(((h * 31 + $(LC_CTYPE=C printf %d "'$char")) % 4294967296))
    str="${str#?}"
  done
  printf %x\\n $h
}

verbose() { :; }
[ "${MVNW_VERBOSE-}" != true ] || verbose() { printf %s\\n "${1-}"; }

die() {
  printf %s\\n "$1" >&2
  exit 1
}

trim() {
  # MWRAPPER-139:
  #   Trims trailing and leading whitespace, carriage returns, tabs, and linefeeds.
  #   Needed for removing poorly interpreted newline sequences when running in more
  #   exotic environments such as mingw bash on Windows.
  printf "%s" "${1}" | tr -d '[:space:]'
}

scriptDir="$(dirname "$0")"
scriptName="$(basename "$0")"

# parse distributionUrl and optional distributionSha256Sum, requires .mvn/wrapper/maven-wrapper.properties
while IFS="=" read -r key value; do
  case "${key-}" in
  distributionUrl) distributionUrl=$(trim "${value-}") ;;
  distributionSha256Sum) distributionSha256Sum=$(trim "${value-}") ;;
  esac
done <"$scriptDir/.mvn/wrapper/maven-wrapper.properties"
[ -n "${distributionUrl-}" ] || die "cannot read distributionUrl property in $scriptDir/.mvn/wrapper/maven-wrapper.properties"

case "${distributionUrl##*/}" in
maven-mvnd-*bin.*)
  MVN_CMD=mvnd.sh _MVNW_REPO_PATTERN=/maven/mvnd/
  case "${PROCESSOR_ARCHITECTURE-}${PROCESSOR_ARCHITEW6432-}:$(uname -a)" in
  *AMD64:CYGWIN* | *AMD64:MINGW*) distributionPlatform=windows-amd64 ;;
  :Darwin*x86_64) distributionPlatform=darwin-amd64 ;;
  :Darwin*arm64) distributionPlatform=darwin-aarch64 ;;
  :Linux*x86_64*) distributionPlatform=linux-amd64 ;;
  *)
    echo "Cannot detect native platform for mvnd on $(uname)-$(uname -m), use pure java version" >&2
    distributionPlatform=linux-amd64
    ;;
  esac
  distributionUrl="${distributionUrl%-bin.*}-$distributionPlatform.zip"
  ;;
maven-mvnd-*) MVN_CMD=mvnd.sh _MVNW_REPO_PATTERN=/maven/mvnd/ ;;
*) MVN_CMD="mvn${scriptName#mvnw}" _MVNW_REPO_PATTERN=/org/apache/maven/ ;;
esac

# apply MVNW_REPOURL and calculate MAVEN_HOME
# maven home pattern: ~/.m2/wrapper/dists/{apache-maven-<version>,maven-mvnd-<version>-<platform>}/<hash>
[ -z "${MVNW_REPOURL-}" ] || distributionUrl="$MVNW_REPOURL$_MVNW_REPO_PATTERN${distributionUrl#*"$_MVNW_REPO_PATTERN"}"
distributionUrlName="${distributionUrl##*/}"
distributionUrlNameMain="${distributionUrlName%.*}"
distributionUrlNameMain="${distributionUrlNameMain%-bin}"
MAVEN_USER_HOME="${MAVEN_USER_HOME:-${HOME}/.m2}"
MAVEN_HOME="${MAVEN_USER_HOME}/wrapper/dists/${distributionUrlNameMain-}/$(hash_string "$distributionUrl")"

exec_maven() {
  unset MVNW_VERBOSE MVNW_USERNAME MVNW_PASSWORD MVNW_REPOURL || :
  exec "$MAVEN_HOME/bin/$MVN_CMD" "$@" || die "cannot exec $MAVEN_HOME/bin/$MVN_CMD"
}

if [ -d "$MAVEN_HOME" ]; then
  verbose "found existing MAVEN_HOME at $MAVEN_HOME"
  exec_maven "$@"
fi

case "${distributionUrl-}" in
*?-bin.zip | *?maven-mvnd-?*-?*.zip) ;;
*) die "distributionUrl is not valid, must match *-bin.zip or maven-mvnd-*.zip, but found '${distributionUrl-}'" ;;
esac

# prepare tmp dir
if TMP_DOWNLOAD_DIR="$(mktemp -d)" && [ -d "$TMP_DOWNLOAD_DIR" ]; then
  clean() { rm -rf -- "$TMP_DOWNLOAD_DIR"; }
  trap clean HUP INT TERM EXIT
else
  die "cannot create temp dir"
fi

mkdir -p -- "${MAVEN_HOME%/*}"

# Download and Install Apache Maven
verbose "Couldn't find MAVEN_HOME, downloading and installing it ..."
verbose "Downloading from: $distributionUrl"
verbose "Downloading to: $TMP_DOWNLOAD_DIR/$distributionUrlName"

# select .zip or .tar.gz
if ! command -v unzip >/dev/null; then
  distributionUrl="${distributionUrl%.zip}.tar.gz"
  distributionUrlName="${distributionUrl##*/}"
fi

# verbose opt
__MVNW_QUIET_WGET=--quiet __MVNW_QUIET_CURL=--silent __MVNW_QUIET_UNZIP=-q __MVNW_QUIET_TAR=''
[ "${MVNW_VERBOSE-}" != true ] || __MVNW_QUIET_WGET='' __MVNW_QUIET_CURL='' __MVNW_QUIET_UNZIP='' __MVNW_QUIET_TAR=v

# normalize http auth
case "${MVNW_PASSWORD:+has-password}" in
'') MVNW_USERNAME='' MVNW_PASSWORD='' ;;
has-password) [ -n "${MVNW_USERNAME-}" ] || MVNW_USERNAME='' MVNW_PASSWORD='' ;;
esac

if [ -z "${MVNW_USERNAME-}" ] && command -v wget >/dev/null; then
  verbose "Found wget ... using wget"
  wget ${__MVNW_QUIET_WGET:+"$__MVNW_QUIET_WGET"} "$distributionUrl" -O "$TMP_DOWNLOAD_DIR/$distributionUrlName" || die "wget: Failed to fetch $distributionUrl"
elif [ -z "${MVNW_USERNAME-}" ] && command -v curl >/dev/null; then
  verbose "Found curl ... using curl"
  curl ${__MVNW_QUIET_CURL:+"$__MVNW_QUIET_CURL"} -f -L -o "$TMP_DOWNLOAD_DIR/$distributionUrlName" "$distributionUrl" || die "curl: Failed to fetch $distributionUrl"
elif set_java_home; then
  verbose "Falling back to use Java to download"
  javaSource="$TMP_DOWNLOAD_DIR/Downloader.java"
  targetZip="$TMP_DOWNLOAD_DIR/$distributionUrlName"
  cat >"$javaSource" <<-END
public class Downloader extends java.net.Authenticator
{
  protected java.net.PasswordAuthentication getPasswordAuthentication()
  {
    return new java.net.PasswordAuthentication( System.getenv( "MVNW_USERNAME" ), System.getenv( "MVNW_PASSWORD" ).toCharArray() );
  }
  public static void main( String[] args ) throws Exception
  {
    setDefault( new Downloader() );
    java.nio.file.Files.copy( java.net.URI.create( args[0] ).toURL().openStream(), java.nio.file.Paths.get( args[1] ).toAbsolutePath().normalize() );
  }
}
END
  # For Cygwin/MinGW, switch paths to Windows format before running javac and java
  verbose " - Compiling Downloader.java ..."
  "$(native_path "$JAVACCMD")" "$(native_path "$javaSource")" || die "Failed to compile Downloader.java"
  verbose " - Running Downloader.java ..."
  "$(native_path "$JAVACMD")" -cp "$(native_path "$TMP_DOWNLOAD_DIR")" Downloader "$distributionUrl" "$(native_path "$targetZip")"
fi

# If specified, validate the SHA-256 sum of the Maven distribution zip file
if [ -n "${distributionSha256Sum-}" ]; then
  distributionSha256Result=false
  if [ "$MVN_CMD" = mvnd.sh ]; then
    echo "Checksum validation is not supported for maven-mvnd." >&2
    echo "Please disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties." >&2
    exit 1
  elif command -v sha256sum >/dev/null; then
    if echo "$distributionSha256Sum $TMP_DOWNLOAD_DIR/$distributionUrlName" | sha256sum -c - >/dev/null 2>&1; then
      distributionSha256Result=true
    fi
  elif command -v shasum >/dev/null; then
    if echo "$distributionSha256Sum $TMP_DOWNLOAD_DIR/$distributionUrlName" | shasum -a 256 -c >/dev/null 2>&1; then
      distributionSha256Result=true
    fi
  else
    echo "Checksum validation was requested but neither 'sha256sum' or 'shasum' are available." >&2
    echo "Please install either command, or disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties." >&2
    exit 1
  fi
  if [ $distributionSha256Result = false ]; then
    echo "Error: Failed to validate Maven distribution SHA-256, your Maven distribution might be compromised." >&2
    echo "If you updated your Maven version, you need to update the specified distributionSha256Sum property." >&2
    exit 1
  fi
fi

# unzip and move
if command -v unzip >/dev/null; then
  unzip ${__MVNW_QUIET_UNZIP:+"$__MVNW_QUIET_UNZIP"} "$TMP_DOWNLOAD_DIR/$distributionUrlName" -d "$TMP_DOWNLOAD_DIR" || die "failed to unzip"
else
  tar xzf${__MVNW_QUIET_TAR:+"$__MVNW_QUIET_TAR"} "$TMP_DOWNLOAD_DIR/$distributionUrlName" -C "$TMP_DOWNLOAD_DIR" || die "failed to untar"
fi

# Find the actual extracted directory name (handles snapshots where filename != directory name)
actualDistributionDir=""

# First try the expected directory name (for regular distributions)
if [ -d "$TMP_DOWNLOAD_DIR/$distributionUrlNameMain" ]; then
  if [ -f "$TMP_DOWNLOAD_DIR/$distributionUrlNameMain/bin/$MVN_CMD" ]; then
    actualDistributionDir="$distributionUrlNameMain"
  fi
fi

# If not found, search for any directory with the Maven executable (for snapshots)
if [ -z "$actualDistributionDir" ]; then
  # enable globbing to iterate over items
  set +f
  for dir in "$TMP_DOWNLOAD_DIR"/*; do
    if [ -d "$dir" ]; then
      if [ -f "$dir/bin/$MVN_CMD" ]; then
        actualDistributionDir="$(basename "$dir")"
        break
      fi
    fi
  done
  set -f
fi

if [ -z "$actualDistributionDir" ]; then
  verbose "Contents of $TMP_DOWNLOAD_DIR:"
  verbose "$(ls -la "$TMP_DOWNLOAD_DIR")"
  die "Could not find Maven distribution directory in extracted archive"
fi

verbose "Found extracted Maven distribution directory: $actualDistributionDir"
printf %s\\n "$distributionUrl" >"$TMP_DOWNLOAD_DIR/$actualDistributionDir/mvnw.url"
mv -- "$TMP_DOWNLOAD_DIR/$actualDistributionDir" "$MAVEN_HOME" || [ -d "$MAVEN_HOME" ] || die "fail to move MAVEN_HOME"

clean || :
exec_maven "$@"
189
agent-backend/mvnw.cmd
vendored
Normal file
@@ -0,0 +1,189 @@
<# : batch portion
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements.  See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership.  The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License.  You may obtain a copy of the License at
@REM
@REM    http://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------

@REM ----------------------------------------------------------------------------
@REM Apache Maven Wrapper startup batch script, version 3.3.4
@REM
@REM Optional ENV vars
@REM   MVNW_REPOURL - repo url base for downloading maven distribution
@REM   MVNW_USERNAME/MVNW_PASSWORD - user and password for downloading maven
@REM   MVNW_VERBOSE - true: enable verbose log; others: silence the output
@REM ----------------------------------------------------------------------------

@IF "%__MVNW_ARG0_NAME__%"=="" (SET __MVNW_ARG0_NAME__=%~nx0)
@SET __MVNW_CMD__=
@SET __MVNW_ERROR__=
@SET __MVNW_PSMODULEP_SAVE=%PSModulePath%
@SET PSModulePath=
@FOR /F "usebackq tokens=1* delims==" %%A IN (`powershell -noprofile "& {$scriptDir='%~dp0'; $script='%__MVNW_ARG0_NAME__%'; icm -ScriptBlock ([Scriptblock]::Create((Get-Content -Raw '%~f0'))) -NoNewScope}"`) DO @(
  IF "%%A"=="MVN_CMD" (set __MVNW_CMD__=%%B) ELSE IF "%%B"=="" (echo %%A) ELSE (echo %%A=%%B)
)
@SET PSModulePath=%__MVNW_PSMODULEP_SAVE%
@SET __MVNW_PSMODULEP_SAVE=
@SET __MVNW_ARG0_NAME__=
@SET MVNW_USERNAME=
@SET MVNW_PASSWORD=
@IF NOT "%__MVNW_CMD__%"=="" ("%__MVNW_CMD__%" %*)
@echo Cannot start maven from wrapper >&2 && exit /b 1
@GOTO :EOF
: end batch / begin powershell #>

$ErrorActionPreference = "Stop"
if ($env:MVNW_VERBOSE -eq "true") {
  $VerbosePreference = "Continue"
}

# calculate distributionUrl, requires .mvn/wrapper/maven-wrapper.properties
$distributionUrl = (Get-Content -Raw "$scriptDir/.mvn/wrapper/maven-wrapper.properties" | ConvertFrom-StringData).distributionUrl
if (!$distributionUrl) {
  Write-Error "cannot read distributionUrl property in $scriptDir/.mvn/wrapper/maven-wrapper.properties"
}

switch -wildcard -casesensitive ( $($distributionUrl -replace '^.*/','') ) {
  "maven-mvnd-*" {
    $USE_MVND = $true
    $distributionUrl = $distributionUrl -replace '-bin\.[^.]*$',"-windows-amd64.zip"
    $MVN_CMD = "mvnd.cmd"
    break
  }
  default {
    $USE_MVND = $false
    $MVN_CMD = $script -replace '^mvnw','mvn'
    break
  }
}

# apply MVNW_REPOURL and calculate MAVEN_HOME
# maven home pattern: ~/.m2/wrapper/dists/{apache-maven-<version>,maven-mvnd-<version>-<platform>}/<hash>
if ($env:MVNW_REPOURL) {
  $MVNW_REPO_PATTERN = if ($USE_MVND -eq $False) { "/org/apache/maven/" } else { "/maven/mvnd/" }
  $distributionUrl = "$env:MVNW_REPOURL$MVNW_REPO_PATTERN$($distributionUrl -replace "^.*$MVNW_REPO_PATTERN",'')"
}
$distributionUrlName = $distributionUrl -replace '^.*/',''
$distributionUrlNameMain = $distributionUrlName -replace '\.[^.]*$','' -replace '-bin$',''

$MAVEN_M2_PATH = "$HOME/.m2"
if ($env:MAVEN_USER_HOME) {
  $MAVEN_M2_PATH = "$env:MAVEN_USER_HOME"
}

if (-not (Test-Path -Path $MAVEN_M2_PATH)) {
  New-Item -Path $MAVEN_M2_PATH -ItemType Directory | Out-Null
}

$MAVEN_WRAPPER_DISTS = $null
if ((Get-Item $MAVEN_M2_PATH).Target[0] -eq $null) {
  $MAVEN_WRAPPER_DISTS = "$MAVEN_M2_PATH/wrapper/dists"
} else {
  $MAVEN_WRAPPER_DISTS = (Get-Item $MAVEN_M2_PATH).Target[0] + "/wrapper/dists"
}

$MAVEN_HOME_PARENT = "$MAVEN_WRAPPER_DISTS/$distributionUrlNameMain"
$MAVEN_HOME_NAME = ([System.Security.Cryptography.SHA256]::Create().ComputeHash([byte[]][char[]]$distributionUrl) | ForEach-Object {$_.ToString("x2")}) -join ''
$MAVEN_HOME = "$MAVEN_HOME_PARENT/$MAVEN_HOME_NAME"

if (Test-Path -Path "$MAVEN_HOME" -PathType Container) {
  Write-Verbose "found existing MAVEN_HOME at $MAVEN_HOME"
  Write-Output "MVN_CMD=$MAVEN_HOME/bin/$MVN_CMD"
  exit $?
}

if (! $distributionUrlNameMain -or ($distributionUrlName -eq $distributionUrlNameMain)) {
  Write-Error "distributionUrl is not valid, must end with *-bin.zip, but found $distributionUrl"
}

# prepare tmp dir
$TMP_DOWNLOAD_DIR_HOLDER = New-TemporaryFile
$TMP_DOWNLOAD_DIR = New-Item -Itemtype Directory -Path "$TMP_DOWNLOAD_DIR_HOLDER.dir"
$TMP_DOWNLOAD_DIR_HOLDER.Delete() | Out-Null
trap {
  if ($TMP_DOWNLOAD_DIR.Exists) {
    try { Remove-Item $TMP_DOWNLOAD_DIR -Recurse -Force | Out-Null }
    catch { Write-Warning "Cannot remove $TMP_DOWNLOAD_DIR" }
  }
}

New-Item -Itemtype Directory -Path "$MAVEN_HOME_PARENT" -Force | Out-Null

# Download and Install Apache Maven
Write-Verbose "Couldn't find MAVEN_HOME, downloading and installing it ..."
Write-Verbose "Downloading from: $distributionUrl"
Write-Verbose "Downloading to: $TMP_DOWNLOAD_DIR/$distributionUrlName"

$webclient = New-Object System.Net.WebClient
if ($env:MVNW_USERNAME -and $env:MVNW_PASSWORD) {
  $webclient.Credentials = New-Object System.Net.NetworkCredential($env:MVNW_USERNAME, $env:MVNW_PASSWORD)
}
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webclient.DownloadFile($distributionUrl, "$TMP_DOWNLOAD_DIR/$distributionUrlName") | Out-Null

# If specified, validate the SHA-256 sum of the Maven distribution zip file
$distributionSha256Sum = (Get-Content -Raw "$scriptDir/.mvn/wrapper/maven-wrapper.properties" | ConvertFrom-StringData).distributionSha256Sum
if ($distributionSha256Sum) {
  if ($USE_MVND) {
    Write-Error "Checksum validation is not supported for maven-mvnd. `nPlease disable validation by removing 'distributionSha256Sum' from your maven-wrapper.properties."
  }
  Import-Module $PSHOME\Modules\Microsoft.PowerShell.Utility -Function Get-FileHash
  if ((Get-FileHash "$TMP_DOWNLOAD_DIR/$distributionUrlName" -Algorithm SHA256).Hash.ToLower() -ne $distributionSha256Sum) {
    Write-Error "Error: Failed to validate Maven distribution SHA-256, your Maven distribution might be compromised. If you updated your Maven version, you need to update the specified distributionSha256Sum property."
  }
}

# unzip and move
Expand-Archive "$TMP_DOWNLOAD_DIR/$distributionUrlName" -DestinationPath "$TMP_DOWNLOAD_DIR" | Out-Null

# Find the actual extracted directory name (handles snapshots where filename != directory name)
$actualDistributionDir = ""

# First try the expected directory name (for regular distributions)
$expectedPath = Join-Path "$TMP_DOWNLOAD_DIR" "$distributionUrlNameMain"
$expectedMvnPath = Join-Path "$expectedPath" "bin/$MVN_CMD"
if ((Test-Path -Path $expectedPath -PathType Container) -and (Test-Path -Path $expectedMvnPath -PathType Leaf)) {
  $actualDistributionDir = $distributionUrlNameMain
}

# If not found, search for any directory with the Maven executable (for snapshots)
if (!$actualDistributionDir) {
  Get-ChildItem -Path "$TMP_DOWNLOAD_DIR" -Directory | ForEach-Object {
    $testPath = Join-Path $_.FullName "bin/$MVN_CMD"
    if (Test-Path -Path $testPath -PathType Leaf) {
      $actualDistributionDir = $_.Name
    }
  }
}

if (!$actualDistributionDir) {
  Write-Error "Could not find Maven distribution directory in extracted archive"
}

Write-Verbose "Found extracted Maven distribution directory: $actualDistributionDir"
Rename-Item -Path "$TMP_DOWNLOAD_DIR/$actualDistributionDir" -NewName $MAVEN_HOME_NAME | Out-Null
try {
  Move-Item -Path "$TMP_DOWNLOAD_DIR/$MAVEN_HOME_NAME" -Destination $MAVEN_HOME_PARENT | Out-Null
} catch {
  if (! (Test-Path -Path "$MAVEN_HOME" -PathType Container)) {
    Write-Error "fail to move MAVEN_HOME"
  }
} finally {
  try { Remove-Item $TMP_DOWNLOAD_DIR -Recurse -Force | Out-Null }
  catch { Write-Warning "Cannot remove $TMP_DOWNLOAD_DIR" }
}

Write-Output "MVN_CMD=$MAVEN_HOME/bin/$MVN_CMD"
160
agent-backend/pom.xml
Normal file
@@ -0,0 +1,160 @@
<?xml version="1.0" encoding="UTF-8"?>
|
||||||
|
<project xmlns="http://maven.apache.org/POM/4.0.0"
|
||||||
|
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
|
||||||
|
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
|
||||||
|
https://maven.apache.org/xsd/maven-4.0.0.xsd">
|
||||||
|
<modelVersion>4.0.0</modelVersion>
|
||||||
|
|
||||||
|
<groupId>de.galabau</groupId>
|
||||||
|
<artifactId>dateieingang-service</artifactId>
|
||||||
|
<version>1.0.0-SNAPSHOT</version>
|
||||||
|
<packaging>jar</packaging>
|
||||||
|
|
||||||
|
<properties>
|
||||||
|
<maven.compiler.release>25</maven.compiler.release>
|
||||||
|
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
|
||||||
|
<quarkus.platform.group-id>io.quarkus.platform</quarkus.platform.group-id>
|
||||||
|
<quarkus.platform.artifact-id>quarkus-bom</quarkus.platform.artifact-id>
|
||||||
|
<quarkus.platform.version>3.34.1</quarkus.platform.version>
|
||||||
|
</properties>
|
||||||
|
|
||||||
|
<dependencyManagement>
|
||||||
|
<dependencies>
|
||||||
|
<dependency>
|
||||||
|
<groupId>${quarkus.platform.group-id}</groupId>
|
||||||
|
<artifactId>${quarkus.platform.artifact-id}</artifactId>
|
||||||
|
<version>${quarkus.platform.version}</version>
|
||||||
|
<type>pom</type>
|
||||||
|
<scope>import</scope>
|
||||||
|
</dependency>
|
||||||
|
</dependencies>
|
||||||
|
</dependencyManagement>
|
||||||
|
|
||||||
|
<dependencies>
|
||||||
|
<!-- Quarkus Core -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>io.quarkus</groupId>
|
||||||
|
<artifactId>quarkus-arc</artifactId>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- REST Endpoint (eingehend von APEX Automation) -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>io.quarkus</groupId>
|
||||||
|
<artifactId>quarkus-rest-jackson</artifactId>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- REST Client (ausgehend zu ORDS) -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>io.quarkus</groupId>
|
||||||
|
<artifactId>quarkus-rest-client-jackson</artifactId>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- Async: ManagedExecutor für fire-and-forget Pipeline -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>io.quarkus</groupId>
|
||||||
|
<artifactId>quarkus-smallrye-context-propagation</artifactId>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- Fehlertoleranz (@Retry, @Timeout) -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>io.quarkus</groupId>
|
||||||
|
<artifactId>quarkus-smallrye-fault-tolerance</artifactId>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- Observability -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>io.quarkus</groupId>
|
||||||
|
<artifactId>quarkus-opentelemetry</artifactId>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- SFTP: SSHJ — moderner Java SFTP/SSH Client -->
|
||||||
|
<!-- Aktuelle Version: https://mvnrepository.com/artifact/com.hierynomus/sshj -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>com.hierynomus</groupId>
|
||||||
|
<artifactId>sshj</artifactId>
|
||||||
|
<version>0.38.0</version>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- OCI Object Storage SDK -->
|
||||||
|
<!-- Aktuelle Version: https://mvnrepository.com/artifact/com.oracle.oci.sdk/oci-java-sdk-objectstorage -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>com.oracle.oci.sdk</groupId>
|
||||||
|
<artifactId>oci-java-sdk-objectstorage</artifactId>
|
||||||
|
<version>3.44.0</version>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- ZIP-Verarbeitung -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>org.apache.commons</groupId>
|
||||||
|
<artifactId>commons-compress</artifactId>
|
||||||
|
<version>1.26.1</version>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
|
<!-- Test -->
|
||||||
|
<dependency>
|
||||||
|
<groupId>io.quarkus</groupId>
|
||||||
|
<artifactId>quarkus-junit5</artifactId>
|
||||||
|
<scope>test</scope>
|
||||||
|
</dependency>
|
        <dependency>
            <groupId>org.assertj</groupId>
            <artifactId>assertj-core</artifactId>
            <version>3.26.3</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>${quarkus.platform.group-id}</groupId>
                <artifactId>quarkus-maven-plugin</artifactId>
                <version>${quarkus.platform.version}</version>
                <extensions>true</extensions>
                <executions>
                    <execution>
                        <goals>
                            <goal>build</goal>
                            <goal>generate-code</goal>
                            <goal>generate-code-tests</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.13.0</version>
                <!-- No explicit <release> element; driven by maven.compiler.release (25) -->
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>3.2.5</version>
                <configuration>
                    <systemPropertyVariables>
                        <java.util.logging.manager>org.jboss.logmanager.LogManager</java.util.logging.manager>
                        <maven.home>${maven.home}</maven.home>
                    </systemPropertyVariables>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <profiles>
        <!-- Local development with the Grafana/LGTM stack -->
        <profile>
            <id>grafana</id>
            <activation>
                <activeByDefault>false</activeByDefault>
            </activation>
            <dependencies>
                <dependency>
                    <groupId>io.quarkus</groupId>
                    <artifactId>quarkus-observability-devservices-lgtm</artifactId>
                    <scope>provided</scope>
                </dependency>
            </dependencies>
        </profile>
    </profiles>
</project>
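The build section above wires up the quarkus-maven-plugin (with its `build` and code-generation goals), pins the compiler and Surefire plugins, and adds an opt-in `grafana` profile for the LGTM observability Dev Services. With the Maven wrapper added in this commit, typical invocations would look like the following sketch. Only the `grafana` profile id comes from the pom; the goals shown are the standard Quarkus CLI surface, not something specific to this project.

```shell
# Dev mode with live reload (run from agent-backend/)
./mvnw quarkus:dev

# Dev mode plus the local Grafana/LGTM observability stack
./mvnw quarkus:dev -Pgrafana

# Run the test suite; Surefire is configured above to route
# java.util.logging through the JBoss LogManager
./mvnw test

# Production build via the quarkus-maven-plugin's build goal
./mvnw package
```

The `provided` scope on the LGTM dependency keeps the observability Dev Service out of the production artifact; it only participates when the profile is activated explicitly.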
@@ -1,6 +1,6 @@
-# Workflow: Automated file intake via n8n → OCI Object Storage → DB
+# Workflow: Automated file intake — SFTP → OCI Object Storage → DB
 
-**As of:** 2026-04-07
+**As of:** 2026-04-08
 
 ---
 
@@ -9,10 +9,13 @@
 | System | Role |
 |---|---|
 | **SFTP server** | Source — external supplier drops off ZIP files |
-| **n8n** | Middleware — fetches the ZIP, unpacks it, uploads files + marker to OCI |
+| **Dateieingang Service** | Middleware (Quarkus) — fetches the ZIP, unpacks it, uploads files + marker to OCI |
 | **OCI Object Storage** | Staging area — inbox folder, target folder after processing |
 | **Oracle DB / APEX** | Processing — reads files from OCI, imports the data |
 
+Details on the Dateieingang Service: `agent-backend/docs/Architecture.md`
+Details on the DB-side processing: `database/docs/plan_pck_net_storage.md`
+
 ---
 
 ## Flow
@@ -24,23 +27,25 @@
 │ 1. Subfolder in eingang/ with marker present?                   │
 │      → yes: process files (same logic as steps 4-6)             │
 │                                                                 │
-│ 2. Trigger n8n webhook (fire & forget)                          │
+│ 2. Call the Dateieingang Service (fire & forget)                │
+│      HTTP POST /api/process-incoming (header: X-Api-Key)        │
 └────────────────────────────┬────────────────────────────────────┘
                              │
                              ▼
 ┌─────────────────────────────────────────────────────────────────┐
-│ n8n Workflow                                                    │
+│ Dateieingang Service (Quarkus, runs in the background)          │
 │                                                                 │
-│ 3a. Download the ZIP file from the SFTP server                  │
+│ 3a. List new *.zip files on the SFTP server                     │
-│ 3b. Unpack the ZIP                                              │
+│ 3b. Download and unpack the ZIP                                 │
 │ 3c. Upload all files to OCI eingang/<zip-name>/                 │
 │      (subfolders from the ZIP are preserved)                    │
-│      → "Continue on Error" OFF: one failure stops everything    │
+│      → a failure stops processing of this ZIP                   │
 │ 3d. Upload the marker                                           │
 │      eingang/<zip-name>/_READY_FOR_DB_PROCESSING_               │
-│ 3e. Rename / move the ZIP on the SFTP server                    │
+│ 3e. Rename the ZIP on SFTP (.processed or .error)               │
 │      → only AFTER a successful marker upload                    │
-│ 3f. Call the ORDS endpoint                                      │
+│ 3f. Call the ORDS endpoint (pck_auto_import.p_process_incoming) │
+│ 3g. Delete local working files                                  │
 └────────────────────────────┬────────────────────────────────────┘
                              │
                              ▼
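The ordering in steps 3c-3e is the crux of the design: payload files first, marker second, SFTP rename last, so the DB can never observe a half-uploaded batch. A minimal runnable sketch of that ordering, with a local temp directory standing in for the OCI bucket and the SFTP server (the `lieferung1` names and file contents are purely illustrative, not the service's real configuration):

```shell
set -eu
work=$(mktemp -d)
bucket="$work/bucket"
sftp="$work/sftp"
mkdir -p "$bucket/eingang" "$sftp"
touch "$sftp/lieferung1.zip"            # ZIP as dropped off by the supplier

zip_name=lieferung1
dest="$bucket/eingang/$zip_name"
mkdir -p "$dest"

# 3c: upload all payload files first ...
printf 'data' > "$dest/datei1.csv"
printf 'data' > "$dest/datei2.csv"

# 3d: ... then the marker, signalling the batch is complete
touch "$dest/_READY_FOR_DB_PROCESSING_"

# 3e: only after the marker exists, rename the ZIP on "SFTP"
mv "$sftp/lieferung1.zip" "$sftp/lieferung1.zip.processed"

ls "$dest"
```

If the script dies anywhere before 3d, the folder exists without a marker and the ZIP keeps its original name, which is exactly the retry-friendly state the workflow relies on.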
@@ -80,21 +85,19 @@ bucket/
 ```
 
 The marker is kept until **all** files of the subfolder have been
-processed successfully. That way, failed files are picked up again
-on the next run.
+processed successfully. Failed files are retried on the next run.
 
 ---
 
 ## Error-case behavior
 
-**n8n: uploading a file fails**
+**Service: uploading a file fails**
-- Workflow stops immediately ("Continue on Error" is off)
+- Processing of this ZIP stops immediately
-- No marker is written, the ZIP stays untouched on SFTP
+- No marker is written, the ZIP on SFTP is renamed to `.error`
 - ORDS is not called
-- On the next hourly run, n8n downloads the ZIP again
-- Files already uploaded are overwritten (OCI PUT is idempotent)
+- Files already uploaded are overwritten on the next trigger (OCI PUT is idempotent)
 
-**n8n: the ORDS call fails**
+**Service: the ORDS call fails**
 - The marker is in `eingang/<zip-name>/`, the files are fully uploaded
 - On the next hourly run, the APEX Automation finds the marker and processes
 
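The failure mode above can be sketched the same way: an aborted upload leaves a folder without a marker, and the hourly check in step 1 simply skips it. A runnable illustration with local stand-in paths and hypothetical batch names (`lieferung1`, `lieferung2`):

```shell
set -eu
work=$(mktemp -d)
bucket="$work/bucket"
sftp="$work/sftp"
mkdir -p "$bucket/eingang" "$sftp"

# Completed batch: payload plus marker
mkdir -p "$bucket/eingang/lieferung1"
printf 'data' > "$bucket/eingang/lieferung1/datei1.csv"
touch "$bucket/eingang/lieferung1/_READY_FOR_DB_PROCESSING_"

# Failed batch: upload aborted before the marker step, ZIP renamed to .error
mkdir -p "$bucket/eingang/lieferung2"
printf 'data' > "$bucket/eingang/lieferung2/datei1.csv"
touch "$sftp/lieferung2.zip.error"

# What the hourly marker check (step 1 of the flow) would decide:
for d in "$bucket"/eingang/*/; do
  if [ -f "$d/_READY_FOR_DB_PROCESSING_" ]; then
    echo "process: $(basename "$d")"
  else
    echo "skip: $(basename "$d")"
  fi
done
```

Here the loop prints `process: lieferung1` and `skip: lieferung2`; the incomplete batch is invisible to the DB side until a later, successful upload writes its marker.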
@@ -115,9 +118,10 @@ on the next run.
 
 The state lives in the file system:
 - Subfolder with marker = batch ready or partially processed
-- Subfolder without marker = incomplete n8n upload, ignored
+- Subfolder without marker = incomplete upload, ignored
 - File in the target folder = processed successfully
 - File still in `eingang/<zip-name>/` = pending or failed
+- ZIP on SFTP with `.error` = persistent failure, manual review needed
 
 Error details are stored in `lg_app_log`. Via `log_object_ref`, every file
 is unambiguously linked to a ZIP. No behavior is derived from the log —
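The rule that the marker survives until **all** files are processed implies a cleanup invariant on the DB side: move each processed file to the target folder, and delete the marker only once no payload files remain. A sketch of that invariant against a local stand-in directory (the real logic lives in the DB package, see `database/docs/plan_pck_net_storage.md`; names here are illustrative):

```shell
set -eu
work=$(mktemp -d)
src="$work/bucket/eingang/lieferung1"
dst="$work/bucket/ziel/lieferung1"
mkdir -p "$src" "$dst"
printf 'a' > "$src/datei1.csv"
printf 'b' > "$src/datei2.csv"
touch "$src/_READY_FOR_DB_PROCESSING_"

# Move each successfully processed file to the target folder
for f in "$src"/*.csv; do
  mv "$f" "$dst/"
done

# Delete the marker only once no payload files remain in eingang/
remaining=$(find "$src" -type f ! -name '_READY_FOR_DB_PROCESSING_' | wc -l | tr -d ' ')
if [ "$remaining" -eq 0 ]; then
  rm "$src/_READY_FOR_DB_PROCESSING_"
fi
```

If any `mv` failed mid-batch, `remaining` would be non-zero, the marker would stay, and the next hourly run would pick the leftover files up again, matching the state table above.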
@@ -127,5 +131,5 @@ it serves only as an audit trail.
 
 ## Schedule
 
-The APEX Automation runs **once per hour**. The n8n workflow is re-triggered each time —
-so it also runs hourly, offset in time after the Automation start.
+The APEX Automation runs **once per hour**. The Dateieingang Service is called via
+HTTP POST and runs offset in time after the Automation start.