Issue
I have a few questions regarding Snowpark with Python.
Why do we need Snowpark when we already have the (free) Snowflake Connector for Python, which can connect a Jupyter notebook to a Snowflake data warehouse?
If we use Snowpark and connect from a local Jupyter notebook to run an ML model, does it use our local machine's compute or Snowflake's compute? If it is our local machine's compute, how can we use Snowflake's compute to run the ML model?
Solution
- Snowpark for Python lets you treat a Snowflake table like a Spark DataFrame. This means you can run Spark-style DataFrame code against Snowflake tables without pulling the data out of Snowflake, and the compute is Snowflake's fully elastic compute, not your local machine's.
- As long as you are executing Snowpark DataFrame logic in Python, the compute happens on the Snowflake side. If you pull the data back to your machine to run other logic (pandas, for example), Snowpark will transfer the data to your local machine and the compute will happen there as usual.
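A minimal sketch of the pattern described above, assuming the `snowflake-snowpark-python` package is installed and using placeholder credentials and a hypothetical `TRIPS` table. The DataFrame operations are pushed down and run on a Snowflake warehouse; only the final `to_pandas()` call moves data to the local machine:

```python
# Sketch: keep the heavy lifting in Snowflake via the Snowpark DataFrame API.
# All names below (credentials, the TRIPS table, its columns) are placeholders.

def run_snowpark_example(connection_parameters: dict) -> None:
    # Imported inside the function so the sketch can be read (and the
    # placeholder config inspected) without the package installed;
    # install with: pip install snowflake-snowpark-python
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import avg, col

    session = Session.builder.configs(connection_parameters).create()

    # Builds a lazy DataFrame over the table; no data leaves Snowflake yet.
    df = session.table("TRIPS")  # hypothetical table

    # Filters and aggregations are translated to SQL and executed on a
    # Snowflake virtual warehouse, not on your laptop.
    result = (
        df.filter(col("DURATION") > 300)
          .group_by("START_STATION")
          .agg(avg("DURATION").alias("AVG_DURATION"))
    )
    result.show()  # triggers execution in Snowflake, prints a sample

    # Only here does data move to the local machine; any pandas logic
    # after this point uses local compute.
    local_pdf = result.to_pandas()
    print(local_pdf.head())

    session.close()


# Placeholder connection parameters -- replace with your own.
CONNECTION_PARAMETERS = {
    "account": "<your_account>",
    "user": "<your_user>",
    "password": "<your_password>",
    "warehouse": "<your_warehouse>",
    "database": "<your_database>",
    "schema": "<your_schema>",
}

if __name__ == "__main__":
    run_snowpark_example(CONNECTION_PARAMETERS)
```

The key design point is laziness: each DataFrame method call just extends a query plan, and nothing executes until an action such as `show()` or `to_pandas()` is called, so you control exactly when (and whether) data crosses the network.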
I recommend starting here to learn more:
https://docs.snowflake.com/en/developer-guide/snowpark/index.html
Answered By - Mike Walton