Our Experience at the Snowflake Data Cloud World Tour: The World of Data, Apps and AI 2023 in Delhi

11 / Sep / 2023 by Dheeraj Gupta

I had the opportunity to attend the Data Cloud World Tour, and it was all about collaborating with data in unimaginable ways. I joined the event with Suprakash Maity, Prashant Singhal, Sushant, and Vikramjeet, along with other data leaders, to learn about the latest capabilities of the Data Cloud and to hear directly from Snowflake customers about their most exciting use cases. The event covered a wide range of topics, including the future of generative AI and LLMs, working with Apache Iceberg in Snowflake, and bringing more development to the data with Snowpark Container Services, among others.

As a participant, I was able to choose from various breakout sessions presented by fellow Snowflake customers, experts, and partners. I also had the opportunity to attend the opening keynote to hear about the newest innovations in the Data Cloud, including how Snowflake can power AI without trade-offs. Networking with local Snowflake customers, technology partners, and data leaders was another highlight, and it allowed me to explore how to drive even more value from my data. It was an event that opened up new possibilities for both my company and my career.

Registration and Keynote Session

Registration ran from 9 a.m. to 10 a.m. and was mostly digital, handled via a QR code. The keynote session started at 10:30 a.m., and I had the opportunity to discover the next wave of innovations in the Data Cloud, including advancements in open table formats, generative AI and LLMs, flexible programmability with Snowpark, and more. It was fascinating to see how the Data Cloud brought data, compute, and ecosystem together for AI/ML initiatives without limits. I also had the chance to listen to stories from local customers who used the Data Cloud to eliminate data silos and securely put their data to work across their entire business. It was impressive to see how Snowflake continued to redefine what was possible with data.
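For readers who have not seen Snowpark before, here is a minimal sketch of what that programmability looks like in Python. The account details and the orders table are placeholders of my own, not anything shown at the keynote.

```python
# A minimal Snowpark sketch: open a session and push a DataFrame-style
# aggregation down to Snowflake. The account details and the "orders" table
# are placeholders, not anything shown at the keynote.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, count

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# The transformations below compile to SQL and execute inside Snowflake,
# so no data leaves the warehouse until results are fetched.
orders = session.table("orders")
orders.group_by("status").agg(count(col("order_id")).alias("n_orders")).show()
```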

Breakout Session 1: Winning the Race to the Future: Fast-tracking Data Modernization and Value Realization

During my involvement in the data and analytics modernization program, I encountered challenges in achieving our set goals for time-to-market and delivering value to our end customers. These challenges mainly stemmed from the complexity of our legacy systems. It became evident that the success of the program hinged on having a trusted advisor and partner who could provide not only technical expertise but also deep industry domain knowledge.

I had the opportunity to participate in an enriching session that shed light on the essential components of modernizing a data platform and making it capable of handling AI and data analytics workloads on Snowflake. It was a valuable experience that highlighted the importance of having the right guidance and expertise in such a transformative journey.

Breakout Session 2: Unleashing the Power of Large Language Models with Snowflake

During the event, I had the opportunity to delve into how Snowflake could assist me in harnessing the power of LLMs for my data needs, both at present and in the future. The session provided insights on building my own LLM-powered application using Snowflake and Streamlit, complete with live demonstrations illustrating how generative AI and LLMs can be used to drive natural language data tasks like explanation, exploration, recommendation, discussion, and question answering.
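The presenters built their demo live, and their exact code was not distributed. The sketch below shows the general shape of such an application, reusing the Snowpark session pattern from the earlier sketch; the generate_sql helper is a hypothetical stand-in for whichever LLM provider you wire in, not a Snowflake or Streamlit API.

```python
# A sketch of an LLM-powered "ask your data" app built with Streamlit and
# Snowpark. generate_sql() is a hypothetical placeholder for an LLM call.
import streamlit as st
from snowflake.snowpark import Session

# connection_parameters as defined in the Snowpark sketch earlier.
session = Session.builder.configs(connection_parameters).create()

def generate_sql(question: str) -> str:
    """Placeholder: translate a natural-language question into SQL over
    your schema using the LLM provider of your choice."""
    raise NotImplementedError("plug in an LLM provider here")

st.title("Ask your data")
question = st.text_input("Ask a question about your data")
if question:
    sql = generate_sql(question)                 # natural language -> SQL
    st.code(sql, language="sql")                 # show the generated query
    st.dataframe(session.sql(sql).to_pandas())   # run it and render the result
```

In a real deployment you would also want to validate the generated SQL before executing it; letting an LLM run arbitrary statements against a warehouse is risky.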

The talk also delved into Snowflake’s strategic moves, including its recent acquisitions of Streamlit and Applica. It highlighted some of the new AI products being developed in-house by Snowflake. Furthermore, the speakers explored the implications of this technology on various aspects such as app development, data storage, processing, and analytics. The session concluded with valuable insights on how Snowflake could serve as a guide in navigating the rapidly changing landscape of the future.

Breakout Session 3: Snowflake on Snowflake – SaaS Spend Optimization Using Snowpark and ML

In the past, many large-scale enterprises wasted millions of dollars annually on licenses for third-party SaaS apps that went unused. Some of these enterprises employed cost-conscious methods to identify and revoke these unused licenses, often relying on arbitrary criteria. Others opted to hire expensive consultants to perform essentially the same task. Given the economic climate, it became imperative to seek out optimal cost-cutting methods rather than settling for mediocre ones.

During a session I attended, Snowflake’s internal Data Science team for IT shared how they had revolutionized the traditional SaaS model, successfully transitioning from a license-based model to a consumption-based one. To achieve this, they leveraged Snowpark and Streamlit, Snowflake’s native machine-learning and web-application capabilities. Their efforts resulted in Snowpatrol, an internal application at Snowflake that quantified user consumption and automated the revocation of unused SaaS licenses. The team also worked with the procurement department to maximize cost avoidance and improve the experience of renewing contracts with key vendors. This innovative approach offered a far more efficient and cost-effective way of managing SaaS licenses.
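Snowpatrol itself is internal to Snowflake and its code was not shared, but the core idea, ranking licenses by recent usage and flagging dormant ones, is easy to sketch in Snowpark. The table and column names below are hypothetical, and the real system layers ML-based usage prediction on top of this.

```python
# A sketch of Snowpatrol's core idea: rank SaaS licenses by recent usage and
# flag dormant ones for revocation. Table and column names are hypothetical;
# the real application also adds ML-based usage prediction on top.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, max as max_, datediff, current_date

session = Session.builder.configs(connection_parameters).create()  # as earlier

logins = session.table("app_login_events")  # hypothetical: one row per login

# Most recent login per user per SaaS application.
last_seen = logins.group_by("app_name", "user_email").agg(
    max_(col("login_at")).alias("last_login")
)

# Licenses idle for more than 90 days become revocation candidates.
candidates = last_seen.filter(
    datediff("day", col("last_login"), current_date()) > 90
)
candidates.show()
```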

Breakout Session 4: Best Practices for Snowflake’s Native Cost Optimization Capabilities

During the event, I learned how crucial it was for organizations, including mine, to effectively manage and optimize costs as we expanded our usage of Snowflake across various workloads and use cases. The session featured insights from Snowflake’s product management team, who discussed their approach to cost management within Snowflake, emphasizing three key pillars: visibility, control, and optimization.

Throughout the session, I had the opportunity to see and learn about the latest cost optimization features offered by Snowflake. These insights were valuable in helping organizations like mine strike the right balance between cost and value, ensuring the continued success of our Snowflake deployments across diverse scenarios.
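No code was handed out in this session either, but the visibility pillar maps directly onto Snowflake’s documented ACCOUNT_USAGE views. Here is a sketch of the kind of credit-consumption report the speakers described:

```python
# A sketch of the "visibility" pillar using Snowflake's documented
# ACCOUNT_USAGE views: credits consumed per warehouse over the last 30 days.
from snowflake.snowpark import Session

session = Session.builder.configs(connection_parameters).create()  # as earlier

session.sql("""
    SELECT warehouse_name,
           SUM(credits_used) AS credits_last_30d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_last_30d DESC
""").show()

# The "control" pillar pairs naturally with resource monitors, e.g.:
#   CREATE RESOURCE MONITOR monthly_cap WITH CREDIT_QUOTA = 100
#     TRIGGERS ON 90 PERCENT DO NOTIFY
#              ON 100 PERCENT DO SUSPEND;
```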

Open Forum: Big Data vs. Snowflake

I explored the limitations of Hadoop and traditional big data architectures and gained insights into how Snowflake could revolutionize the way I approached data and analytics. Hadoop and conventional big data setups have their constraints, often grappling with issues of scalability, operational complexity, and the need for specialized expertise. These limitations can keep organizations from efficiently harnessing the potential of their data.

Snowflake, on the other hand, offers a transformative approach to data and analytics. It provides a cloud-based, fully managed platform that eliminates many of the challenges associated with traditional big data systems. Snowflake’s architecture allows for seamless scalability, simplified management, and accessibility for a broader range of users within an organization. This paradigm shift can empower businesses to make more informed decisions and derive greater value from their data assets.
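To make the scalability contrast concrete: where growing a Hadoop cluster typically means provisioning and rebalancing nodes, resizing Snowflake compute is a single statement. A sketch, reusing the session from the earlier examples, with analytics_wh as a placeholder warehouse name:

```python
# A concrete illustration of the elasticity contrast with Hadoop: resizing
# Snowflake compute is one statement rather than a node-provisioning
# exercise. "analytics_wh" is a placeholder warehouse name, and "session"
# is the Snowpark session from the earlier sketches.
session.sql("ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'XLARGE'").collect()

# Scale back down (or rely on AUTO_SUSPEND) once the heavy workload finishes.
session.sql("ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'XSMALL'").collect()
```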

In essence, Snowflake represents a promising evolution in the world of data and analytics, offering a more streamlined, scalable, and user-friendly alternative to traditional Hadoop and big data architectures.

Final Thoughts

The event offered great case studies for anyone with existing Snowflake applications in production. We got the opportunity to discuss our application with the experts and receive a detailed report outlining its bottlenecks and more optimized ways of using Snowflake. The event was also a great spot for networking and meeting like-minded people.

Finally, thanks to the Data Cloud World Tour team for organizing this wonderful event. I look forward to attending and presenting at more such events in the future.
