Astor Data Dome allows you to set up locations (email inboxes and shared drive folders) that will be monitored; an action is triggered whenever a file arrives. Each incoming file is matched against a set of rules predefined by the user. You can also create a custom scheduler (or trigger it manually) to extract data by any means, including an API or scraping, and process it further.
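The rule-matching step can be pictured as follows. This is a minimal, generic sketch; the rule structure, field names, and `match_rules` function are illustrative assumptions, not Astor Data Dome's actual API:

```python
import fnmatch

# Hypothetical user-defined rules: each maps a filename pattern
# to the pipeline that should handle matching files.
RULES = [
    {"pattern": "invoice_*.csv", "pipeline": "invoices"},
    {"pattern": "*.xlsx",        "pipeline": "spreadsheets"},
]

def match_rules(filename):
    """Return the pipelines triggered by an arriving file."""
    return [r["pipeline"] for r in RULES
            if fnmatch.fnmatch(filename, r["pattern"])]

print(match_rules("invoice_2024.csv"))  # → ['invoices']
```

A file that matches no rule simply triggers nothing, while a file matching several rules would start several pipelines.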
Each rule is connected to a pipeline node, the crucial element responsible for transformations. Within a node, you can configure all data inputs (such as the rules mentioned above, or even other pipelines), how the file should be read, how the data should be transformed step by step, and where it should be passed next. This also includes reaching out to data stored in the built-in database or in other databases.
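Conceptually, a pipeline node is a reader followed by an ordered list of transformation steps. The sketch below is an illustration under assumed names (`read_csv`, `run_pipeline`, the lambda steps); it is not the product's implementation:

```python
import csv
import io

def read_csv(raw):
    """How the file is read: here, CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def run_pipeline(raw, steps):
    """Apply each transformation step in order to the parsed rows."""
    rows = read_csv(raw)
    for step in steps:
        rows = step(rows)
    return rows

# Two example step-by-step transformations: filter, then convert.
steps = [
    lambda rows: [r for r in rows if r["amount"]],                    # drop rows with no amount
    lambda rows: [{**r, "amount": float(r["amount"])} for r in rows], # cast amount to number
]

data = "id,amount\n1,10.5\n2,\n3,7\n"
print(run_pipeline(data, steps))
```

In the real product these steps are configured in the UI rather than written as code, but the mental model of "read, then transform in order, then pass on" is the same.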
Once data is transformed, it can be sent to three types of locations:
- Email
- Shared drive
- Database

Output details can be customized in the UI, such as the email subject, email body, and output filename.
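Dispatching to one of the three output types can be sketched as a simple lookup. All function names, config fields, and return strings below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical senders for the three output types; in the product
# these details (subject, filename, etc.) come from the UI config.
def send_email(payload, cfg):
    return f"email to {cfg['to']}, subject={cfg['subject']!r}"

def write_drive(payload, cfg):
    return f"drive write {cfg['folder']}/{cfg['filename']}"

def write_db(payload, cfg):
    return f"db insert into {cfg['table']}"

OUTPUTS = {"email": send_email, "drive": write_drive, "database": write_db}

def dispatch(payload, output_type, cfg):
    """Route transformed data to the configured output location."""
    return OUTPUTS[output_type](payload, cfg)

print(dispatch("report", "email", {"to": "ops@example.com", "subject": "Daily"}))
```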
The whole process is monitored in logs, where you can see how your files are being processed, with detailed information.
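A generic sketch of what such per-file processing logs could look like; the stage names and log format here are assumptions, not Astor Data Dome's actual log output:

```python
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def process(filename):
    """Log each processing stage for an arriving file."""
    events = []
    for stage in ("file arrived", "rule matched", "output delivered"):
        msg = f"{stage}: {filename}"
        log.info(msg)
        events.append(msg)
    return events

process("invoice_2024.csv")
```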