What in the Python?
Adrian Brown
Posted on May 5, 2023
TL;DR
This tutorial explains how to use the Apollo SDK Python package to connect to an external database, transform data using Pandas, and perform natural language processing (NLP) on the data. With just a few lines of code, this SDK provides a flexible interface for working with data and integrating with other services.
Prerequisites:
- A Supabase instance spun up with data
- An API token created with the Apollo API
- A copy of your database connection string in URI format
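Before wiring anything up, it can help to sanity-check that your connection string really is in URI format. Here is a small sketch using Python's standard `urllib.parse` module; the URI below is a made-up placeholder, not real credentials:

```python
from urllib.parse import urlparse

# Hypothetical example URI -- replace with your own connection string.
uri = "postgres://myuser:mypassword@db.example.supabase.co:5432/postgres"

parsed = urlparse(uri)

# A valid Postgres-style URI should carry all of these pieces.
print(parsed.scheme)    # postgres
print(parsed.hostname)  # db.example.supabase.co
print(parsed.port)      # 5432
print(parsed.path)      # /postgres (the database name)
```

If any of these come back empty, double-check the string you copied from your database dashboard.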
Quickstart
Let's begin!
Setting Up a Connection to Supabase
First, let's install the Apollo SDK Python package using pip:
pip install apollo-sdk
Once the package is installed, we can use it to set up a connection to Supabase, a popular database management tool. Here's an example code snippet that shows how to connect to Supabase using the Apollo SDK:
# import the package
from apollo.client import Apollo
# sync data from your database instance (we currently support Supabase, or PostgreSQL via a URI-format connection string)
Apollo.connect("postgres://username:password@hostname:port/database_name")
# if you want to test out operations on your external connection
Apollo.fetch_tables()
# query a table in descending or ascending order
Apollo.query("desc", "table", "column")
This code imports the Apollo module from the apollo.client package and uses the Apollo.connect() method to establish a connection to a Supabase database instance. Once the connection is established, we can use the Apollo.query() method to run queries against the database and fetch data.
Transforming Data Using Pandas
Once we have fetched data from the external database, we can use Pandas to transform the data into a table. Here's an example code snippet that shows how to do this:
import pandas as pd
# the data returned by the Apollo.query() method
data = [...] # replace with the actual data
# convert the data into a Pandas DataFrame
df = pd.DataFrame(data)
# return the first 5 rows of the DataFrame
output = df.head()
# print the output to the console
print(output)
Example output of the DataFrame (using Reddit data):
This code uses the Pandas package to transform the data returned by the Apollo.query() method into a table. We first import the pandas module and create a Pandas DataFrame from the data using the pd.DataFrame() constructor. We then use the head() method to return the first five rows of the DataFrame.
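Since we don't have a live database here, the sketch below uses mock rows standing in for the result of Apollo.query(); the real returned structure may differ, but the Pandas steps are the same:

```python
import pandas as pd

# Mock rows standing in for the result of Apollo.query() --
# the real structure of the returned data may differ.
data = [
    {"id": 1, "text": "Great post, thanks for sharing!"},
    {"id": 2, "text": "This looks like spam, click my link."},
    {"id": 3, "text": "I disagree with point two."},
    {"id": 4, "text": "Can you share the source code?"},
    {"id": 5, "text": "Following this thread."},
    {"id": 6, "text": "Nice writeup."},
]

# convert the rows into a DataFrame
df = pd.DataFrame(data)

# head() returns the first five rows by default
print(df.head())
```

From here, `df` behaves like any other DataFrame, so the usual filtering, grouping, and column selection all apply.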
Sending Data to Apollo's AI model
Finally, we can use the Apollo SDK to send the data to a content API for further processing against an AI model. Here's an example code snippet that shows how to do this:
# import the necessary modules
from apollo.client import Apollo
# set up the connection to the content API
Apollo.use("apollo", token="YOUR_API_TOKEN_HERE")
# select the 'text' column from the DataFrame
texts = df['text']
# loop through each text in the 'text' column and send it to the content API
for text in texts:
    # send the text to the content API to detect any threats
    result = Apollo.detectText(text, "contains", "Threats")
    # print the result to the console
    print(result)
This code sets up the connection to the content API using the Apollo.use() method and provides the necessary API token. We then select the text column from the Pandas DataFrame, loop through each entry, and use the Apollo.detectText() method to send each text to the content API and detect any potential threats.
Example output:
{
  status: 200,
  data: {
    actionsTriggered: [
      {
        id: 'ks93-as3',
        name: 'Mute User'
      },
      {
        id: 'f2-xa03d',
        name: 'Escalate Content'
      }
    ]
  }
}
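Once responses come back in this shape, you can pick out just the pieces you care about. The sketch below works over a plain dict modeled on the example output above; real responses come from Apollo.detectText() and may include additional fields:

```python
# A response shaped like the example output above -- in practice this
# would come back from Apollo.detectText(), possibly with more fields.
result = {
    "status": 200,
    "data": {
        "actionsTriggered": [
            {"id": "ks93-as3", "name": "Mute User"},
            {"id": "f2-xa03d", "name": "Escalate Content"},
        ]
    },
}

# pull out just the names of the moderation actions that fired
if result["status"] == 200:
    actions = [a["name"] for a in result["data"]["actionsTriggered"]]
else:
    actions = []

print(actions)  # ['Mute User', 'Escalate Content']
```

Collecting the triggered action names this way makes it easy to log them, count them, or feed them into a downstream moderation workflow.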
Conclusion
This tutorial showed how to use the Apollo SDK Python package to connect to an external database, transform data using Pandas, and apply an AI model for NLP. With just a few lines of code, this powerful SDK provides a flexible interface for working with data and integrating with other services.
Useful links:
- Docs
- Collaborate with us on GitHub
- Join our Discord
- Check out the website
- Our newsletter