Data sources are the Openprise objects that connect to your source systems and pull their data into Openprise. Your data can live in cloud systems such as Marketo, Salesforce, or Eloqua; in cloud drives such as Google Drive, Box, or Dropbox; on FTP servers; or in cloud databases such as Amazon Redshift or Amazon RDS MySQL.
To Create a Data Source
- Go to Data > Data Sources
- Click on "Add Data Source" and select Standard Connector. To learn more about Open Connector, please click HERE
- Data Source Name – required. This name is used to identify the data source when importing it in jobs.
- Data Source Administrators – required. Only users with Administrator rights can use the data source.
- Source Technology and Data Format – Select the source system (can be a sandbox or production environment).
- Add Account Information – add credentials for a valid user in your source system.
- Directory or Entity – the folder where the file resides (cloud drives) or the entity to import (cloud systems and cloud databases).
- If you are importing a static list from Marketo List Members, select the name of the list on the left and move it to the right using the Add button.
- Click NEXT
Depending on the source technology and data format selected, the following options may be presented:
- Automatically run assessment report – select this option to have Openprise run a data assessment report, and select the frequency (Weekly, Monthly) from the drop-down list.
- Email report summary to data source administrators – select this option to email the summary automatically.
Data sources created from a file will provide some additional configuration options:
- Import one file at a time. This option is helpful if you plan to process multiple files through a data source, such as in a list import process.
- Move processed files after importing. When selected, Openprise automatically creates a sub-folder in your directory for processed files. This option is checked automatically when Import one file at a time is selected.
- Time zone
- Text delimiter
- Import fields by name. This option matches fields by column name instead of the default column position. The column name (e.g., Company) must match exactly in every file imported to this data source; see the sketch after this list.
- Skip import if the file is missing fields
- Import additional fields. This option imports fields beyond those defined in the data source when your goal is only to carry their values through, not to transform them in the platform. For example, if your data source is set up with 10 fields and you then import a new file with 15 fields, checking this box carries the 5 extra fields through your jobs so you can export them with the rest. The extra 5 fields will not be visible in the jobs or the data source.
- Truncate values that exceed a certain number of characters.
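To show how name-based matching differs from positional matching, here is a minimal Python sketch (illustrative only, not Openprise code); the file contents and column names are made up:

```python
import csv
import io

# Two files with the same columns in different order. Positional import
# would map "Initech" into the wrong field in the second file; name-based
# import matches on the header text instead.
file_a = "Company,Email\nAcme,jo@acme.com\n"
file_b = "Email,Company\nkim@initech.com,Initech\n"

for raw in (file_a, file_b):
    # DictReader keys each value by its column name, mirroring name-based
    # matching; the header (e.g. "Company") must be spelled identically
    # in every file.
    for row in csv.DictReader(io.StringIO(raw)):
        print(row["Company"], row["Email"])
```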
*NOTE: Data source importing only supports dates in mm/dd/yyyy or mm-dd-yyyy format. Dates supplied as dd/mm/yyyy or dd-mm-yyyy will not convert properly: the day value is read as a month, and the overflow rolls into the year. For example, 30-10-2020 is read as month 30, day 10, year 2020; 30 months is 2 years and 6 months, so 2 years are added to the year value and the result becomes 6-10-2022.
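To make the rollover arithmetic concrete, here is a short Python sketch of the misread (an illustration of the behavior described above, not Openprise internals):

```python
# A dd-mm-yyyy value read as mm-dd-yyyy, with the overflowing month
# rolled into the year.
raw = "30-10-2020"                # intended as 30 October 2020
month, day, year = (int(p) for p in raw.split("-"))   # misread: month = 30

# 30 months overflow: 30 = 2 * 12 + 6, so add 2 years and keep month 6.
year += (month - 1) // 12
month = (month - 1) % 12 + 1

print(f"{month}-{day}-{year}")    # 6-10-2022, i.e. June 10, 2022
```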
Once a data source has been created, its tile will show zero records. You must import the data into the data source.
To Import a Data Source
To import a data source, click on the card of your data source and select Import Now.
Notes: If you are connecting to a system that you are hosting in the cloud (for example, a MySQL database on the AWS cloud, an sFTP server, or AWS S3 bucket), please contact your Customer Success Manager to get IP addresses to whitelist. The whitelist IP addresses are needed to allow Openprise to connect to your servers.
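As one example, if the database sits behind an AWS security group, the whitelist rule could be added with boto3 along these lines. This is a hypothetical sketch: the security group ID and CIDR below are placeholders, and the actual Openprise IP addresses come from your Customer Success Manager.

```python
import boto3

ec2 = boto3.client("ec2")

# Open the MySQL port (3306) to a single Openprise IP address.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",        # placeholder security group ID
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 3306,
        "ToPort": 3306,
        "IpRanges": [{
            "CidrIp": "203.0.113.10/32",   # placeholder Openprise IP
            "Description": "Allow Openprise to import data",
        }],
    }],
)
```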
Reference:
- Learn how to manage existing data sources by clicking HERE.
- For additional help on connecting to Redshift, click HERE.
- For additional help on connecting to an Amazon S3 bucket, click HERE.