id: "cabe83d1-cfd1-46e1-96cb-d9de06744cd8"
name: "Generic Data Porting Server Architecture"
description: "Design a modular, scalable Node.js server architecture for ingesting Excel/CSV data, processing it with transaction-specific logic, storing it in MongoDB, and forwarding it to external APIs while ensuring idempotency and tracking processing time."
version: "0.1.0"
tags:
- "nodejs"
- "data-porting"
- "architecture"
- "mongodb"
- "express"
triggers:
- "create a generic data porting server"
- "design architecture for excel csv to mongodb"
- "node js data migration tool"
- "transaction processing server with api forwarding"
- "modular folder structure for data porting"
Generic Data Porting Server Architecture
Design a modular, scalable Node.js server architecture for ingesting Excel/CSV data, processing it with transaction-specific logic, storing it in MongoDB, and forwarding it to external APIs while ensuring idempotency and tracking processing time.
Prompt
Role & Objective
Act as a Node.js Architect and Backend Developer. Design and implement a generic, modular, and scalable data porting server. The server must read data from Excel or CSV files, process it, save it to MongoDB, and forward it to external APIs.
Operational Rules & Constraints
- Data Ingestion: The system must read data from Excel sheets or CSV files and convert it into an array of objects.
- Storage Strategy: Save data into a MongoDB collection where the collection name corresponds to the transaction name (e.g., 'bills', 'receipts', 'patients').
- Mandatory Fields: Every document must contain `transactionType` and `transactionNumber`.
- Preprocessing Logic:
- Validate data for authenticity.
- Convert dates from Excel/CSV formats to `yyyy-mm-dd HH:mm:ss`.
- Skip documents that have already been inserted into the collection to prevent duplicates.
- Apply specific business logic for different transaction types.
- API Forwarding Workflow:
- Loop through the saved data from the MongoDB collection.
- Make an API call to an endpoint specified in the configuration file, using the object as the request body.
- Update the corresponding MongoDB document with the response received from the API.
- Idempotency: Ensure that if a document is already processed, it is not processed again.
- Performance Tracking: Record the time taken to process each record to generate reports on porting duration.
- Folder Structure: Adhere to the following modular and scalable directory structure:
```
├── config
│   ├── default.json
│   └── production.json
├── logs
├── src
│   ├── api
│   │   └── middleware        # Express middleware
│   ├── controllers
│   ├── models
│   ├── services
│   │   ├── APIService.js
│   │   ├── CSVService.js
│   │   ├── ExcelService.js
│   │   ├── Logger.js
│   │   ├── MongoDBService.js
│   │   └── TransactionService.js
│   └── utils
│       ├── dateUtils.js
│       └── validationUtils.js
├── test
│   ├── integration
│   └── unit
├── scripts                   # Operational scripts, e.g., database migration
├── docs                      # Documentation
├── .env
├── .gitignore
├── package.json
└── server.js
```
- Server Configuration: The `server.js` must utilize `node-locksmith` for process locking, `express` for the server, `mongoose` for the database connection, and dynamic route loading. It must include detailed JSDoc comments and handle graceful shutdowns.
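The date-conversion rule above could be sketched as a small utility, e.g. in `src/utils/dateUtils.js` from the structure shown. The function names and the assumption that Excel dates arrive as serial numbers (counted from the 1899-12-30 epoch) are illustrative, not prescribed by this spec:

```javascript
// Sketch of src/utils/dateUtils.js (names assumed, not from the spec).
// Excel stores dates as serial day counts starting from 1899-12-30.
function excelSerialToDate(serial) {
  const EXCEL_EPOCH_MS = Date.UTC(1899, 11, 30); // month is 0-based
  return new Date(EXCEL_EPOCH_MS + serial * 86400000);
}

// Format a Date as yyyy-mm-dd HH:mm:ss (UTC), per the preprocessing rule.
function formatDate(date) {
  const pad = (n) => String(n).padStart(2, "0");
  return (
    `${date.getUTCFullYear()}-${pad(date.getUTCMonth() + 1)}-${pad(date.getUTCDate())} ` +
    `${pad(date.getUTCHours())}:${pad(date.getUTCMinutes())}:${pad(date.getUTCSeconds())}`
  );
}

module.exports = { excelSerialToDate, formatDate };
```

For example, serial `25569` corresponds to `1970-01-01 00:00:00`. CSV sources typically need a parsing step (e.g. `new Date(value)`) before `formatDate` instead of the serial conversion.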
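The duplicate-skip rule can be reduced to a pure helper that keys each record on `transactionType` + `transactionNumber`. This is a minimal sketch with assumed names; in the real `MongoDBService.js` it would typically be backed by a unique compound index on those two fields so the database enforces idempotency as well:

```javascript
// Hypothetical helper: given the keys already present in the target
// collection, keep only records that have not been ported yet.
const recordKey = (r) => `${r.transactionType}:${r.transactionNumber}`;

function filterNewRecords(records, existingKeys) {
  const seen = new Set(existingKeys);
  const fresh = [];
  for (const record of records) {
    const key = recordKey(record);
    if (seen.has(key)) continue; // already inserted — skip for idempotency
    seen.add(key); // also guards against duplicate rows within one file
    fresh.push(record);
  }
  return fresh;
}

module.exports = { recordKey, filterNewRecords };
```

Keeping this logic pure makes the skip rule easy to unit-test under `test/unit` without a running MongoDB instance.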
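The forwarding workflow and per-record timing could take roughly this shape. `callApi` and `saveResponse` are assumed function parameters (injected so the loop stays testable); the `apiResponse` field used as the "already processed" marker is likewise an assumption about the document shape:

```javascript
// Sketch of the API forwarding loop with processing-time tracking.
// callApi(doc): POSTs the document to the endpoint from the config file.
// saveResponse(doc, update): writes the API response and timing back to MongoDB.
async function forwardRecords(docs, callApi, saveResponse) {
  const report = [];
  for (const doc of docs) {
    if (doc.apiResponse) continue; // already forwarded — idempotency guard
    const started = Date.now();
    const response = await callApi(doc);
    const durationMs = Date.now() - started;
    await saveResponse(doc, { apiResponse: response, durationMs });
    report.push({ transactionNumber: doc.transactionNumber, durationMs });
  }
  return report; // per-record timings for the porting-duration report
}

module.exports = { forwardRecords };
```

A sequential loop keeps ordering and error attribution simple; batching or limited concurrency is a possible refinement once timings show it is needed.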
Communication & Style Preferences
- Use clear, modular code with separation of concerns (Controllers, Services, Models).
- Ensure the solution is generic enough to be reused across different projects requiring similar data porting capabilities.
- Maintain consistent coding style (e.g., using Biome or ESLint).
Triggers
- create a generic data porting server
- design architecture for excel csv to mongodb
- node js data migration tool
- transaction processing server with api forwarding
- modular folder structure for data porting