
neo4j-partners/hands-on-lab-neo4j-and-databricks-on-aws


hands-on-lab-neo4j-databricks on AWS

Neo4j is the leading graph database vendor. We’ve worked closely with Databricks engineering for years to combine graph data science with lakehouse architectures. Our products, AuraDB and AuraDS, are offered as managed services and are available on AWS Marketplace.

In this hands-on lab, you’ll learn about Neo4j, Databricks, and Amazon Bedrock. The lab is intended for data scientists and data engineers. We’ll walk through deploying Neo4j on AWS, connecting it to Databricks, and using Bedrock for generative AI workflows. Then we’ll get hands-on with a real-world dataset.

First, we’ll use generative AI to parse and load regulatory data into Neo4j with Databricks pipelines. Then we’ll show how to layer a chatbot powered by generative AI (via LangChain) over the knowledge graph. We’ll even use the new vector search and index functionality in Neo4j with Bedrock for semantic search. You’ll come out of this lab with enough knowledge to apply graph-powered generative AI to your own datasets.
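To make the semantic-search step concrete, here is a minimal sketch of how Neo4j's vector index pairs with Bedrock embeddings. The index name, node label (`Chunk`), property name, and dimension (1536, the size of Amazon Titan text embeddings) are illustrative assumptions, not the lab's exact setup:

```python
import math

# Sketch of a Neo4j 5.x vector index definition. All names here are
# assumptions for illustration; the lab notebooks define the real schema.
CREATE_INDEX = """
CREATE VECTOR INDEX filingEmbeddings IF NOT EXISTS
FOR (c:Chunk) ON (c.textEmbedding)
OPTIONS {indexConfig: {
  `vector.dimensions`: 1536,
  `vector.similarity_function`: 'cosine'
}}
"""

# Semantic search: retrieve the k chunks nearest a question embedding.
SEMANTIC_SEARCH = """
CALL db.index.vector.queryNodes('filingEmbeddings', $k, $embedding)
YIELD node, score
RETURN node.text AS text, score
"""

def cosine_similarity(a, b):
    """The score the index computes under the 'cosine' configuration."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm
```

At query time, the chatbot embeds the user's question with Bedrock, passes that vector as `$embedding`, and grounds its answer in the returned chunks.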

We’re going to analyze the quarterly filings of asset managers with $100m+ assets under management (AUM). These filings are submitted to the Securities and Exchange Commission’s (SEC) EDGAR system. We’ll show how to load that data from an S3 bucket into Databricks, process it, and push it into Neo4j. Then we’ll explore the relationships of different asset managers and their holdings using the Neo4j Browser and Cypher query language.
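The load step above can be sketched as a small transform plus an idempotent Cypher write. The column names (`cik`, `managerName`, `cusip`, `value`) and the graph schema (`Manager`, `Company`, `OWNS`) are assumptions for illustration, not the lab's exact schema:

```python
def filing_to_params(row: dict) -> dict:
    """Map one 13F holding record to parameters for a Cypher MERGE."""
    return {
        "managerCik": row["cik"],
        "managerName": row["managerName"],
        "cusip": row["cusip"],
        "value": int(row["value"]),  # reported holding value in USD
    }

# MERGE keeps the load idempotent: re-running the pipeline updates
# existing nodes and relationships instead of duplicating them.
MERGE_HOLDING = """
MERGE (m:Manager {cik: $managerCik}) SET m.name = $managerName
MERGE (c:Company {cusip: $cusip})
MERGE (m)-[o:OWNS]->(c) SET o.value = $value
"""

params = filing_to_params(
    {"cik": "0001234567", "managerName": "Example Capital",
     "cusip": "000000000", "value": "125000"}
)
```

Once the data is in, the same `(Manager)-[:OWNS]->(Company)` pattern is what you'll traverse in Neo4j Browser with Cypher.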

If you’re in the capital markets space, you’ll be interested in applications such as features for algorithmic trading, tail-risk analysis, securities master data management, and more. If you’re outside capital markets, the session is still a useful introduction to building machine learning pipelines with Neo4j, Databricks, and Amazon Bedrock.


Workshop Agenda

Venue

These workshops are organized onsite in an AWS or Databricks office.

Duration

4 hours (see the agenda below), plus a happy hour

Prerequisites

  • A laptop with a web browser
  • Browser access to the AWS Console, Databricks Workspace, and port 7687 on a Neo4j deployment running on AWS
  • If your laptop has a firewall you can't control, consider bringing a personal device
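One quick way to check the port-7687 prerequisite ahead of time is with the official Neo4j Python driver. The host and credentials below are placeholders; `neo4j+s` is the TLS-enabled Bolt scheme Aura uses:

```python
def bolt_uri(host: str, port: int = 7687, secure: bool = True) -> str:
    """Build a Bolt URI for the official Neo4j Python driver."""
    scheme = "neo4j+s" if secure else "neo4j"
    return f"{scheme}://{host}:{port}"

# With the driver installed (`pip install neo4j`) and a live instance:
# from neo4j import GraphDatabase
# driver = GraphDatabase.driver(bolt_uri("xxxx.databases.neo4j.io"),
#                               auth=("neo4j", "<password>"))
# driver.verify_connectivity()  # raises if port 7687 is blocked
```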

Agenda

1. Arrival, Registration, Lunch & Networking — 45 min

  • Check in and grab lunch
  • Meet fellow attendees
  • Confirm Wi-Fi and development environment

2. Welcome & Introduction — 15 min

  • Overview of workshop goals and structure
  • What you’ll build today
  • Quick introduction to Amazon Bedrock, Neo4j, and Databricks

3. Neo4j × Databricks Partnership Overview — 30 min

  • How graph and lakehouse architectures complement each other
  • Common integration patterns
  • Role of Agents (Preview) in data intelligence workflows

4. AWS × Databricks Overview — 15 min

  • How AWS services integrate with Databricks
  • Data flow patterns you’ll reproduce in the lab

Break — 10 min

  • Stretch and refresh
  • Verify your accounts and instances are ready

5. Live Demo: Databricks ⇄ Neo4j — 20 min

  • Walkthrough of the data import flow
  • Explore graph models and example Cypher queries

6. Hands-On Lab — 100 min

  • Set up Neo4j Aura and model your graph
  • Import data from Databricks Unity Catalog
  • Create a Neo4j Agent (Preview) using Text2Cypher and guided Q&A
  • Configure Amazon Bedrock AgentCore and connect it to the Neo4j MCP Gateway
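The Text2Cypher step in the lab boils down to giving an LLM the graph schema plus a natural-language question and asking for a Cypher query. A minimal sketch of that idea, where the schema string and prompt wording are assumptions (the lab's agent tooling handles this for you):

```python
# Hypothetical schema summary for the filings graph used in this lab.
SCHEMA = "(:Manager {cik, name})-[:OWNS {value}]->(:Company {cusip, name})"

def text2cypher_prompt(question: str, schema: str = SCHEMA) -> str:
    """Build a prompt a Bedrock-hosted LLM could answer with Cypher."""
    return (
        "You translate questions into Cypher for this Neo4j schema:\n"
        f"{schema}\n"
        "Return only the Cypher query.\n"
        f"Question: {question}"
    )
```

The returned query is then run against Neo4j, and the results are fed back to the model to produce a grounded answer.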

7. Open Chat / Q&A / Wrap-Up — 5 min

  • Recap major takeaways
  • Share next steps and learning resources

Happy Hour

  • Informal networking and discussion
