Build a Kafka-Enabled HTML5 Trading Analytics Dashboard in 20 Minutes

I spend most of my time on site at banks, hedge fund offices, and exchanges helping them build customized trading analytics displays. One of the most common questions I hear is, “What should our internal technology staff focus on in order to make us more responsive?”
To me, there’s only one right answer: Focus on the data fabric, not the GUI. The internal staff should strive to be as responsive as possible to the trading desks. Opportunities and threats can appear suddenly, triggered by market events like index rebalancing, by economic and political developments, and even by the latest tweets. Traders can’t wait; they must act immediately to capitalize on trading opportunities and minimize the impact of fast-rising threats. Analytical displays may need to be designed and deployed within the trading day to be useful, and waiting on a technology team to build them out carries significant risk and cost. Time to market is critical.
A better way to work is for the desk personnel to concentrate on building their own analytics dashboards while the technology team focuses on the data. The tech team’s job is to deliver the raw figures and the calculated metrics to the desks in real time with minimal latency. When the data-handling systems and the working partnerships are well designed, the trading desks can quickly build and deploy new analytics to match market events as needed, with no time-consuming development process.
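To make that division of labor concrete, here is a minimal sketch of the tech team’s side of the pipeline. It assumes the confluent-kafka Python client, hypothetical topic names ("trades.raw" in, "metrics.vwap" out), and a simple JSON tick layout; the demo described below actually uses Avro with the Schema Registry, which I’ll return to. The pattern is simply: consume raw ticks, derive a metric, and publish it straight back onto Kafka for the desks to visualize.

```python
# Sketch only: consume raw ticks, derive a running VWAP per symbol, and
# republish it as a calculated metric. Topic names, the JSON message
# layout, and the broker address are all hypothetical.
import json
from collections import defaultdict

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "metrics-engine",
    "auto.offset.reset": "latest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["trades.raw"])     # hypothetical raw tick topic
notional = defaultdict(float)          # running sum of price * qty per symbol
volume = defaultdict(float)            # running sum of qty per symbol

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    tick = json.loads(msg.value())     # e.g. {"symbol": "ABC", "price": 10.2, "qty": 500}
    sym = tick["symbol"]
    notional[sym] += tick["price"] * tick["qty"]
    volume[sym] += tick["qty"]
    producer.produce(
        "metrics.vwap",                # hypothetical derived-metric topic
        key=sym,
        value=json.dumps({"symbol": sym, "vwap": notional[sym] / volume[sym]}),
    )
    producer.poll(0)                   # serve delivery callbacks without blocking
```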
Apache Kafka has become a standard part of the infrastructure at many large and mid-sized trading firms, but many of them are not fully leveraging the Kafka platform as the backbone for real-time visual analytics. Some are even developing their own home-grown visual analytics tool sets. This is not only expensive and time-consuming; it also robs the trading desks of the responsiveness they need to operate as nimbly as possible.
Kafka provides an excellent, high-performance analytical message bus as part of its standard functionality. The trick is harnessing it efficiently so traders can make sound, insightful decisions based on massive amounts of fast-changing, real-time data.
There are two persistent myths that I continue to hear. The first is the idea that building out a real-time visual analytics system requires a huge development effort and is a high-risk enterprise. The second is that, in order to generate output displays, you first have to create stale database copies of the data or continuously poll for the latest values. Neither is true. There are platforms available that enable trading, risk, and compliance teams to take full advantage of Kafka’s inherent real-time capabilities without writing a single line of code.
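To see why no polling layer is needed, it helps to look at what a Kafka consumer actually does. In this minimal sketch (again assuming the confluent-kafka Python client and the hypothetical "metrics.vwap" topic from the earlier example), the consumer simply subscribes and waits: the client’s poll() call is a long poll against the broker, so each update is delivered as soon as it is published, with no database snapshot to refresh. A platform like Panopticon does this plumbing for you; this is only what it looks like underneath.

```python
# Sketch of push-style delivery: subscribe once, then receive updates as
# they arrive. Broker address and topic name are hypothetical.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "desk-dashboard",
    "auto.offset.reset": "latest",     # live view: start from the newest data
})
consumer.subscribe(["metrics.vwap"])

while True:
    # poll() blocks until new data arrives (or the timeout elapses); it is a
    # long poll against the broker, not a query against a stale table.
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    print(msg.key(), msg.value())      # the update reaches the display immediately
```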
As an example, I built a fully interactive analytical dashboard with Kafka as my data source (for real-time streams as well as historical time series data), using only simple drag-and-drop techniques. You can see how I did it here:

As you can see, it took about 20 minutes to do from start to finish. I utilized Kafka’s core capabilities along with the central Schema Registry and Avro-format messages. Panopticon provides the trading analytics UI, control, and filtering, and connects natively to Kafka data sources (along with many others, including Kx kdb+, OneTick, AMPS, StreamBase, and Vertica).
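For the curious, here is roughly what consuming those Avro messages looks like in code, assuming Confluent’s Schema Registry client from the confluent-kafka Python package and hypothetical broker and registry addresses. The deserializer looks up the writer’s schema by the ID embedded in each message payload, so every consumer decodes against the schema the producer registered. None of this is needed when using a native Kafka connector like Panopticon’s; it’s just a sketch of the mechanics.

```python
# Sketch of Avro decoding via the Schema Registry. Registry URL, broker
# address, and topic name are hypothetical.
from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

registry = SchemaRegistryClient({"url": "http://localhost:8081"})
deserialize = AvroDeserializer(registry)   # fetches writer schemas by ID

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "avro-dashboard",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["trades.raw"])         # hypothetical Avro topic

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    # Decode the Avro payload into a plain dict; the schema ID is embedded
    # in the leading bytes of the message value.
    record = deserialize(
        msg.value(), SerializationContext(msg.topic(), MessageField.VALUE)
    )
    print(record)
```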
The key thing to remember here is that implementing Kafka properly and using it to its full effect is not a job for a big development team. What you really need are small teams of people who fully understand what they need to see, what they need to calculate, and how their data is organized. Building out the visual analytics platform is the easy part and, to be most effective, should be in the hands of end users, typically the e-trading team members.
We’ll be happy to go through this with you one-on-one if you like; just contact us to set up a time and day that works for you.