TorQ Release v3.1 – Kafka, Data Replay and Subscriber Cutoff

Jamie Grant data capture, kdb, kdb+, TorQ

We are delighted to announce release v3.1 of TorQ, the latest instalment of our kdb+ framework. The first of our great new additions is kafka.q, which provides q language bindings for the ‘distributed streaming platform’ Apache Kafka, a real-time messaging system with persistent storage in message logs. An application architecture built around Kafka could dispense with a tickerplant component, …


Jobs at AquaQ – We’re Hiring!

Jonny Press News

Would you like a job at a young, dynamic company working with cutting-edge technologies? We are hiring now! We are looking for both experienced hires and graduates. For graduate roles you don’t need to have a computing background, as all training will be provided. Take a look at our careers page and please send us an email with your CV and …


AquaQ London Summer Event 2017

AquaQ Admin News

Following the success of our summer event last year, AquaQ is delighted to invite everyone in the kdb+ and wider community to our London Summer Event on Tuesday 6th June.  It will be held in The Drift Bar on Bishopsgate.  We will have some exciting (short) talks followed by plenty of food and drink. The format of the evening will be: …


Integrating kdb+ with Apache Kafka

Jamie Grant kdb, kdb+

We’ve noticed a few clients and listbox/Google group members asking questions about Apache Kafka recently, so we decided to take a closer look and add it to TorQ. Apache Kafka bills itself as a ‘distributed streaming platform’ – what that seems to mean is that it is a real-time messaging system with persistent storage in message logs. The distributed …


TorQ – CME Data Processing Add-on

Aidan O'Gorman datablog, kdb, kdb+, TorQ

Our new addition to the TorQ framework presents a method for processing historical data in its native FIX format from the CME, building and maintaining an order book, and writing the data to disk in a variety of formats suited to either query efficiency or space efficiency. Our example data set consists of FX futures contracts for 11 major currency pairs, but …


Data Replay and Backtest in TorQ

Allan Moore data capture, kdb+, TorQ

TorQ has a new utility for replaying historical data into real-time data processes, datareplay.q. Replaying data is usually the first step towards allowing you to backtest. datareplay.q builds a table of upd function calls like those generated by a tickerplant, but uses a historical database as the data source instead, making it simple to test new or existing real-time subscribers …
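The upd-message idea can be sketched in a few lines of q. This is an illustrative sketch only: the table, column names and handler below are assumptions for the example, not the actual datareplay.q API.

```q
/ toy historical table standing in for an HDB source (hypothetical data)
quote:([]time:09:00 09:01 09:01;sym:`EURUSD`GBPUSD`EURUSD;bid:1.10 1.25 1.11)
/ one (`upd;`quote;rows) message per distinct timestamp, mimicking a tickerplant
msgs:{(`upd;`quote;select from quote where time=x)} each distinct quote`time
upd:{[t;x] show x}        / toy real-time subscriber handler
{value x} each msgs       / replay the messages through upd, in time order
```

Because each message is just a (function;table;data) triple, the same subscriber upd handler works unchanged whether the data arrives live from a tickerplant or replayed from disk.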


Cloud kdb+

Jonathon McMurray data capture, datablog, kdb, kdb+

When setting up a kdb+ production system, you may not always have access to the hardware you need. In such instances, it can be helpful to turn to “the cloud”. There are a number of potential benefits to using a cloud system instead of a local server. For example, important server-related duties such as security, failover and data redundancy will …


New TorQ-FX Released

Jamie Grant data capture, kdb, kdb+, TorQ

Anyone who has used TorQ will most likely be familiar with the Finance Starter Pack. The starter pack is an example data capture system based on randomly generated equity data. It was created to show how to set up an example TorQ installation and how different applications can be built and deployed on …


kdb+ Database Setup Utilities

Aidan O'Gorman data capture, datablog, kdb, kdb+, TorQ

This blog post describes tools to help set up a new kdb+ database. The tools allow you to calculate the expected memory requirements of the database, and to check that columns have been typed correctly in order to avoid sym file bloat. Background: the volume of data that a kdb+ process can store in memory is finite and must be estimated and considered when designing …
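A rough version of the memory estimate described above can be sketched in q. The byte widths, column names and row count here are assumptions for illustration, not the actual utility's calculation.

```q
/ assumed bytes per cell for each column of a hypothetical trade table
widths:`time`sym`price`size!4 8 8 8
estimate:{[w;n] n*sum w}              / rows * bytes per row, ignoring overheads
estimate[widths;1000000]              / 28000000 bytes, roughly 27 MB
```

A real estimate would also account for kdb+'s power-of-two buffer allocation and per-object overheads, but even a crude rows-times-widths figure helps catch undersized hardware before data capture begins.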


kdb+ Online Training Updated – Architecture

Jonny Press data capture, kdb, kdb+, TorQ, training

We recorded our recent kdb+ Architecture Workshop in London, and we’ve added the video and slide deck as a bonus module on our kdb+ Bootcamp Online Training Course. Above is a snapshot – a three-hour in-depth workshop in 15 seconds. Topics covered include: the basics of data capture, extending kdb+ tick and alternative strategies, scaling throughput versus latency, removing …
