Creating an AWS Kafka Service for HFT Data

Step 4: Click Instances in the left-side menu to see the list of your running instances. Click the running instance and record its Public IPv4 address for later use.

Connecting to EC2 Server and Kafka Setup

Step 1: Connect to the EC2 server via SSH. On Linux or macOS, open a terminal and move to the directory where the key pair file downloaded earlier is stored. Then change the permissions of the key pair file and use it to SSH into the newly created EC2 instance.

Note: CryptoFeedKafkaServer.pem is the name of the key pair file and should be replaced with the name of your key pair file. EC2-PUBLIC-IP is your EC2 server's public IP address, which you collected in the last step of starting an AWS Ubuntu server. If you are using Windows, see this reference on how to use PuTTY to SSH into an EC2 server.
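The connection steps above might look like the following sketch. The `~/Downloads` directory is an assumption about where your key pair file landed; substitute your own path, key pair name, and public IP:

```shell
# Move to the directory holding the key pair file (path is an assumption)
cd ~/Downloads

# Restrict the key file's permissions -- SSH refuses keys readable by others
chmod 400 CryptoFeedKafkaServer.pem

# Connect as the default Ubuntu user on the instance
ssh -i CryptoFeedKafkaServer.pem ubuntu@EC2-PUBLIC-IP
```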

Step 2: Prior to downloading, installing, and configuring Kafka, we must make sure that the necessary dependencies are met for CryptoFeed and Kafka.
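A minimal sketch of these dependency commands follows. The `default-jre` package name and the `cryptofeed<2.0` version pin are assumptions (the pin reflects the note below that CryptoFeed 2.0 is incompatible with the later storage steps):

```shell
# Refresh the package index and upgrade the Ubuntu EC2 server
sudo apt update && sudo apt upgrade -y

# Install a Java runtime (required by Kafka) and pip for Python packages
sudo apt install -y default-jre python3-pip

# Install the CryptoFeed and aiokafka Python packages for our producers;
# pinning CryptoFeed below 2.0 is an assumption based on the note below
pip3 install 'cryptofeed<2.0' aiokafka
```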

The above commands update the Ubuntu EC2 server, install Java for Kafka support, and then use pip to add the CryptoFeed and aiokafka Python packages required for creating our CryptoFeed Kafka producers.

Note: The CryptoFeed version is very important. CryptoFeed 2.0 will not work with the data storage steps that follow.

Step 3: Setup of Kafka. Follow the commands below and read the comments to understand each step. The comments help to set the necessary Kafka configurations, so the Kafka brokers can be accessed outside the server by our consumers and that data is only retained for as long as necessary to manage storage space on the minimal EC2 server.
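A sketch of this setup follows, under stated assumptions: the Kafka version and download URL are illustrative (use any current release), and EC2-PUBLIC-IP stands in for the address recorded earlier. The two appended properties implement the configuration the text describes: external access via `advertised.listeners` and short retention via `log.retention.hours`:

```shell
# Download and extract Kafka (version/URL are assumptions; pick a current release)
wget https://downloads.apache.org/kafka/3.7.0/kafka_2.13-3.7.0.tgz
tar -xzf kafka_2.13-3.7.0.tgz
cd kafka_2.13-3.7.0

# Advertise the server's public IP so consumers outside the server can
# reach the broker (replace EC2-PUBLIC-IP with your instance's address)
echo "advertised.listeners=PLAINTEXT://EC2-PUBLIC-IP:9092" >> config/server.properties

# Retain messages for only 1 hour so the small EC2 disk does not fill up
echo "log.retention.hours=1" >> config/server.properties

# Start ZooKeeper, then the Kafka broker, as background daemons
bin/zookeeper-server-start.sh -daemon config/zookeeper.properties
bin/kafka-server-start.sh -daemon config/server.properties
```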

The message retention was set to 1 hour because there is less than 8 GB of storage on the EC2 instance, and the rate of data collection fills the drive after only a few hours. You can retain messages in the logs longer if you allocate more storage to your instance.

Preparing CryptoFeed Script

With Kafka running, the final step on the server is to create and run a Python script that will connect to our desired exchanges via WebSocket and collect cryptocurrency trade data.

Step 1: Write the main.py script. First create the empty Python script via nano main.py, then add the code.
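A minimal sketch of such a producer script follows, assuming CryptoFeed 1.x. The exchange (Coinbase), symbols, and the local `127.0.0.1` bootstrap address are illustrative choices, not the author's exact configuration; the `TradeKafka` backend publishes each trade message to a Kafka topic on the broker started above:

```python
from cryptofeed import FeedHandler
from cryptofeed.backends.kafka import TradeKafka
from cryptofeed.defines import TRADES
from cryptofeed.exchanges import Coinbase


def main():
    # FeedHandler manages the WebSocket connections to the exchanges
    f = FeedHandler()

    # Subscribe to the trades channel on Coinbase for two example pairs and
    # route every trade into Kafka on the local broker (exchange, symbols,
    # and broker address are assumptions -- adjust to your setup)
    f.add_feed(Coinbase(
        channels=[TRADES],
        symbols=['BTC-USD', 'ETH-USD'],
        callbacks={TRADES: TradeKafka(bootstrap='127.0.0.1')},
    ))

    # Run the event loop until interrupted
    f.run()


if __name__ == '__main__':
    main()
```

Leaving the retention window short on the broker means this script can run indefinitely without exhausting the instance's disk.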

Step 2: Then simply run the script via python3 ./main.py

If you wish to adjust the exchanges, coins, or trade data, check out the CryptoFeed repo for inspiration.

Source: https://medium.com/@davidpedersen/creating-an-aws-kafka-service-for-hft-data-913e1e144ec0?source=rss——cryptocurrency-5
