Upstash Kafka and Kafka Connectors have been deprecated and will be removed on March 11, 2025. Please refer to the deprecation notice for more information.

Kafka Connect is a tool for streaming data between Apache Kafka and other systems without writing a single line of code. With Kafka sink connectors, you can export data from your Kafka topics to other storage systems. With Kafka source connectors, you can pull data into your Kafka topics from other systems.

Kafka Connectors can be self-hosted, but that requires you to set up and maintain extra processes and machines. Upstash provides hosted versions of connectors for your Kafka cluster. This removes the burden of maintaining an extra system and also improves performance, since the connector runs close to your cluster.

Pricing

Connectors are free to use. We don't charge anything for connectors beyond the per-message pricing of your Kafka topics. Check out Pricing for details on our per-message pricing.

Get Started

We will create a MongoDB source connector as an example. You can find examples for all supported connectors in the left sidebar under the Connectors section.

Create a Kafka Cluster

If you do not already have a Kafka cluster and topic, follow these steps to create them.
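If you want to confirm that your cluster and credentials work before wiring up the connector, a quick produce from a client can help. The sketch below assumes the kafka-python package and uses placeholder values for the bootstrap endpoint, username, password, and topic name; copy the real values from your cluster's details page in the Upstash Console.

```python
# A minimal connectivity check, assuming the kafka-python package and the
# SCRAM credentials shown on your cluster's details page in the Upstash
# Console. The endpoint, credentials, and topic name below are placeholders.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="YOUR_CLUSTER-us1-kafka.upstash.io:9092",  # placeholder endpoint
    security_protocol="SASL_SSL",
    sasl_mechanism="SCRAM-SHA-256",
    sasl_plain_username="YOUR_KAFKA_USERNAME",
    sasl_plain_password="YOUR_KAFKA_PASSWORD",
)

# Send a test message and wait for it to be delivered.
producer.send("test-topic", b"hello from upstash kafka")
producer.flush()
```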

Create a MongoDB Database

Let's prepare our MongoDB Atlas database. Go to MongoDB Atlas and register. Select Build Database and choose the Free Shared option for this example. Proceed with Create Cluster, as the defaults should be fine. If this is your first time, you will see the Security Quickstart screen.

Choose a username and password. You will need them later for the MongoDB connection string.

In the next screen you will allow Upstash to connect to your MongoDB database, so be careful in this step.

Select Cloud Environment and then IP Access List. Enter the following static Upstash IP addresses into the IP Access List.

52.48.149.7
52.213.40.91
174.129.75.41
34.195.190.47
52.58.175.235
18.158.44.120
63.34.151.162
54.247.137.96
3.78.151.126
3.124.80.204
34.236.200.33
44.195.74.73

From here, you will be redirected to the Database Deployments screen. Go to Connect and select Connect your application to find the MongoDB URI (connection string). Copy this string to use later when creating the Kafka connector. Don't forget to replace the password placeholder with the password you selected earlier for your MongoDB user.
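To double-check that the connection string and IP allowlist are correct, you can try connecting from a client yourself. This is a minimal sketch using the pymongo package; the URI below is a placeholder, so substitute the connection string you copied from Atlas.

```python
# A quick sanity check of the MongoDB URI, assuming the pymongo package.
# The URI is a placeholder; use the connection string copied from Atlas,
# with the password placeholder replaced by your MongoDB user's password.
from pymongo import MongoClient

uri = "mongodb+srv://USERNAME:PASSWORD@YOUR_CLUSTER.mongodb.net/?retryWrites=true&w=majority"
client = MongoClient(uri)

# The ping command succeeds only if the client can reach the cluster.
client.admin.command("ping")
print("Connected to MongoDB Atlas")
```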

Create the Connector

Head over to console.upstash.com and select your Kafka cluster. Go to the Connectors tab and create your first connector with the New Connector button.

Then choose MongoDB Connector Source as your connector for this example.

Choose a connector name and enter the MongoDB URI (connection string) we prepared earlier in the Config screen. The other configurations are optional; we will skip them for now.

The Advanced screen is for any other configuration that the selected connector supports. At the top of this screen, you can find a link to the related documentation. For this example, we can proceed with what we have and click the Connect button directly.
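Under the hood, the fields you fill in correspond to standard Kafka Connect properties for the MongoDB source connector. The sketch below shows roughly what such a configuration looks like; the property names follow the MongoDB Kafka connector's documented options, and the values are placeholders for this example.

```python
# Roughly the configuration the console builds for you; all values are placeholders.
# Property names follow the MongoDB Kafka source connector's documented options.
connector_config = {
    "name": "my-mongodb-source",
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "mongodb+srv://USERNAME:PASSWORD@YOUR_CLUSTER.mongodb.net",
    "database": "my_database",      # optional: watch a single database
    "collection": "my_collection",  # optional: watch a single collection
}
```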

Congratulations, you have created your first source connector for Kafka. Note that no topics will be created until some data is available in MongoDB.

See It In Action

With this setup, anything you introduce into your MongoDB database will be available on your Kafka topic immediately.

Let's go to MongoDB and populate it with some data.

From the main Database screen, choose Browse Collections, and then click Add My Own Data. Create your database in the next screen.

Select Insert Document on the right.

Let's put some data here.
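If you prefer to insert the document programmatically instead of through the Atlas UI, a small script works just as well. This sketch assumes pymongo and uses placeholder database, collection, and field names.

```python
# Insert a sample document; the URI, database, collection, and field names
# are placeholders for this example.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://USERNAME:PASSWORD@YOUR_CLUSTER.mongodb.net")
db = client["my_database"]

result = db["my_collection"].insert_one({"name": "first-event", "value": 42})
print("Inserted document with _id:", result.inserted_id)
```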

Shortly afterwards, you should see a topic named DATABASE_NAME.COLLECTION_NAME, matching your MongoDB database and collection, created in the Upstash Console.

After selecting the topic, you can go to the Messages section to see the latest events as they arrive from Kafka.
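You can also read the same events from your own consumer. The sketch below assumes kafka-python, the placeholder credentials from the earlier producer example, and a topic named my_database.my_collection (the database and collection names used in the insert example above).

```python
# Consume the change events produced by the connector; the endpoint,
# credentials, and topic name are placeholders matching the earlier examples.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "my_database.my_collection",  # topic created as DATABASE_NAME.COLLECTION_NAME
    bootstrap_servers="YOUR_CLUSTER-us1-kafka.upstash.io:9092",
    security_protocol="SASL_SSL",
    sasl_mechanism="SCRAM-SHA-256",
    sasl_plain_username="YOUR_KAFKA_USERNAME",
    sasl_plain_password="YOUR_KAFKA_PASSWORD",
    auto_offset_reset="earliest",
)

for message in consumer:
    print(message.value)
```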

Next

Check our list of available connectors and how to use them via the following links:

If the connector you need is not in this list, please add a request to our Road Map.