
Bring Snowflake data to Adaptive

Updated in September 2025.


Since I first wrote this article a couple of months ago, Workday Adaptive has released two new ways to connect to Snowflake and other data lakes, such as Microsoft Fabric and Incorta.

1) Using the Custom Cloud Data Source -- which has now become one of the preferred methods!




2) Or even better, via the other preferred method: Pipelines.
Menu > Integrations > Pipelines:
Pipelines have several advantages. Basically, a pipeline acts as a direct tunnel to your data lake, with drill-through capability straight to the source. And it acts as a 1. Data source, 2. Loader and 3. Task!

You can find more information on the pipeline method & setup in this article here.


Old & Original Article: the methods below still work and are valid, but AWS S3 is definitely no longer the preferred method!


There are 3 ways to bring data from Snowflake into Adaptive:

  1. Use Adaptive's native integration module  >>> that's what I will cover in this post <<<
  2. Use third-party integration platforms such as Tray.ai, Workato, Boomi, etc.
  3. Host and write your own script (in Python, PowerShell, C#, you name it!)


This article focuses on point 1: Adaptive's integration module.

Within Adaptive's native integration module, there are also 3 possible methods! I will explain the pros and cons of each and give more details on the easiest one. From within Adaptive, you can connect to Snowflake by setting up one of these data sources:

  1. JDBC 
  2. ETL (Scripted Data Source)
  3. CCDS (Custom Cloud Data Source)    >>> preferred method, explained below <<<


As of December 2024!

My preferred method is CCDS using AWS S3, since Snowflake offers super easy native integration with S3, and Adaptive has a dedicated library to connect to S3.


Used-to-be-preferred Adaptive-to-Snowflake integration method: CCDS + AWS S3

Adaptive will fetch your Snowflake data from a file on AWS S3 (or an SFTP); S3 is preferred due to its ease of integration.

Below, I provide links to the documentation on the Snowflake <> AWS S3 integration, so keep reading.

As of today, you can't make HTTPS web requests from Adaptive's CCDS directly to Snowflake: Adaptive's HTTPS library is too limited to work with Snowflake, and you can't add external JavaScript libraries. But maybe that will come down the road?


Workflow

Your Snowflake/data team writes a small script that triggers the queries, generates a CSV output file and saves it to AWS S3 (on a schedule, for example!). Then, on Adaptive's end, a simple JavaScript fetches the file and pushes it to your sheets! A minimal sketch of the Snowflake side is shown below.
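
To make the Snowflake side concrete, here is a minimal sketch, assuming the snowflake-connector-python package and a named external stage (adaptive_s3_stage, created in the setup sketch in the next section). All account, table and column names are placeholders to adapt to your environment.

```python
# Minimal sketch: unload a Snowflake query result to S3 as a single CSV.
# Assumes a named external stage "adaptive_s3_stage" already points at your
# S3 bucket (see the setup sketch in the next section). All names below are
# placeholders.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder; prefer key-pair auth in production
    warehouse="your_wh",
    database="your_db",
    schema="your_schema",
)
try:
    conn.cursor().execute("""
        COPY INTO @adaptive_s3_stage/adaptive_actuals.csv
        FROM (SELECT account, department, period, amount FROM actuals)
        FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"')
        HEADER = TRUE
        SINGLE = TRUE      -- one file, easier for Adaptive's CCDS to fetch
        OVERWRITE = TRUE
    """)
finally:
    conn.close()
```

You could schedule this with cron or Airflow, or skip Python entirely and run the same COPY INTO statement from a scheduled Snowflake Task.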


Snowflake's documentation

Here are the key links on how to set things up on Snowflake's side:

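For orientation, here is a minimal sketch of the two Snowflake objects those docs walk you through: a storage integration (the IAM-role trust between Snowflake and your AWS account) and a named stage pointing at your bucket. The role ARN, bucket and object names are placeholders, and the AWS IAM policy side is covered in Snowflake's guide.

```python
# Minimal sketch of the Snowflake-side setup: a storage integration plus the
# named stage targeted by the unload script above. ARN, bucket and object
# names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    role="ACCOUNTADMIN",  # or any role with CREATE INTEGRATION privilege
    database="your_db", schema="your_schema",
)
cur = conn.cursor()

# One-time trust relationship between Snowflake and your AWS account.
cur.execute("""
    CREATE STORAGE INTEGRATION IF NOT EXISTS adaptive_s3_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-unload'
      STORAGE_ALLOWED_LOCATIONS = ('s3://your-bucket/adaptive/')
""")

# The named stage used as the COPY INTO target in the unload sketch above.
cur.execute("""
    CREATE STAGE IF NOT EXISTS adaptive_s3_stage
      URL = 's3://your-bucket/adaptive/'
      STORAGE_INTEGRATION = adaptive_s3_int
""")
conn.close()
```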

Other integrations within Adaptive, pros and cons

As mentioned, there are, for now, 3 ways to integrate with Snowflake. Here are the pros and cons of each (including the CCDS method from above).


| Datasource | Requires Agent? | Requires Pentaho Kettle? | Requires Javascripting? | Comments |
| --- | --- | --- | --- | --- |
| JDBC | Yes 🔴 | No 🟢 | No 🟢 | Requires install of data agent (more info below) + its drivers to manage the connection. It's a bit of an old way to do things. Sometimes the Agent service may stop running! Def. not my favorite! |
| ETL | Yes 🔴 | Yes 🔴 | No 🟢 | Requires install of data agent + Pentaho (more info below). Snowflake will deposit the data into the server where this agent runs. Pentaho transforms data for Adaptive. Definitely not recommended... Plus I heard rumors that this may retire! |
| CCDS | No 🟢 | No 🟢 | Yes 🔴 | Setup explained above. Minimal Javascripting in Adaptive to fetch the CSV file. Recommended and preferred method: simple and quick to setup. |

Data Agent and Pentaho

The Data Agent is a component of Adaptive's Integration module that runs on a server; it extracts data from JDBC-compliant databases or custom data sources, and can also export data from Adaptive Planning. It requires a Windows server behind the customer's firewall, operating as a hosted service to manage access to on-premises apps and connections to Adaptive Planning in the cloud.

The Agent Service, a Windows Service launching a Java application, manages the data agents by polling the Adaptive Planning Gateway, which communicates with the Agent Service through the firewall. Multiple data agents can be hosted by a single Agent Service instance.

The installation includes a Data Agent Service Manager for setup and configuration, and optionally, Pentaho Kettle for ETL processes. If used, the Agent Service manages Kettle ETL job runners in Java virtual machines. Adaptive Planning provides plugins for integrating Kettle with the cloud. Pentaho components are only needed for Kettle scripts; otherwise, JDBC-compliant databases are used.

Data agents are managed through the agent UI in the cloud, accessible via a web browser. The UI allows customers to install, provision, suspend, resume, and upgrade data agents, as well as monitor their status and version.


More info from Workday's Adaptive documentation: 


