If you can program in C/C++ then Go is really easy to pick up. I use it for prototyping and for some production work. Rust takes time to get proficient at, but it's quite nice once you are.
Make sure you pull the latest from that repo. I pushed my latest work, which updates to the latest schema.
My intention is to scalp futures, but I need a good data source for stocks as well; e.g., I will trade Nasdaq futures if AAPL moves. It looks like IQFeed is the way to go for complementary live data. I don't think I will use it as much for historical data, because I plan to serialize every quote/trade I receive so I can do my own replays later, plus unit tests, non-regression tests, etc.

I guess I don't have to start colocating right now, because many traders trade this system manually, but it is good to have a roadmap in case I eventually do. In any case I will isolate the program from the API, since I need that abstraction anyway for my own replays. That way I could start with Rithmic or Denali and switch later if needed, though I'd prefer to avoid it. It looks like I should discard IB unless I rely on IQFeed for live data.
Rithmic Diamond (their faster API) claims: "With Diamond Cutter the latency is more deterministic and with less variability than the classic ticker and order plant. Publicly we claim that tick to trade is typically less than 250 microseconds. You are likely to find that it is much less and with high consistency." However, it has certain broker requirements plus colocation costs, since I would need my own server. Still waiting for them to answer.
Note that as long as it's not something crazy, I don't mind paying up to $1,000 per month while developing the algo (I hope it will take me less than 6 months). I just want to choose the right APIs/tools.
Sierra Chart Denali has Nasdaq TotalView for ~$25 extra per month. Nasdaq data comes from NY/NJ. You would then only need to integrate with a single interface. Otherwise, polygon.io or alpaca.markets have WebSocket APIs, or you can stick with IQFeed.
If you go with Rithmic for order routing, then ditch IQFeed and use Rithmic data. If you use IQFeed for data, it does not make much sense to pay for Rithmic's expensive <250 µs routing, since IQFeed has 30-40 ms latency and its protocol is text based rather than binary.
For co-location you can use beeksgroup.com. They have VPS and servers. Cheapest server is around $250 per month. They have a cross-connect to CME and Rithmic in their DC3 data center. Works for Sierra Chart Teton Order Routing as well.
Picking the right tools is a deceptively difficult and easily costly endeavor. Sierra Chart has possibly the worst UI and user-friendliness around, and a website designed back in the 90s. It initially made me think the product was going to be abandoned in the near future. That assumption was wrong, however; it was just created by highly technical people. Denali connects to CME via a 10 Gb MDP link, the absolute best connection CME offers. Denali normalizes the UDP messages into its DTC binary protocol and immediately broadcasts the message. Sierra Chart's Teton Order Routing is the same design in reverse. It should be competitive with Rithmic's 250 microsecond router, except Teton is free if you have Sierra Chart. Both have the same type of connection to CME in the same Aurora data center.
In summary, Sierra Chart is directly competitive with Rithmic, and any difference in latency will be extremely minor. Sierra Chart will be around $80 per month with order routing and Denali with full market depth, including CME licensing fees; Rithmic will be many times that. Even if cost weren't a factor, I would personally still go with Sierra Chart, because the DTC protocol is lightweight and flexible to connect to. If I were you I'd ditch IQFeed, go with Sierra Chart or Rithmic, and stick your application on a Beeks Group server/VPS in DC3. For low-latency stock market access, use Sierra Chart or polygon.io.
FWIW I built my platform with Rithmic's R|API and colocate with them via TheOmne. I spent months doing latency optimization to get everything multi-threaded so I didn't block the main thread that reads the data feed and submits orders. I run everything through queues and have around 10 different threads all doing different work. I'm sure this made a difference of a few nanoseconds, but there are still plenty of days where the data-feed lag alone is enough to kill me. I have seen plenty of bursts where the exchange timestamp runs so far ahead of mine that there is nothing I can do but sit on the sidelines and wait.
I thought about going with the Diamond API, but I don't need faster execution; I need a faster data feed, and I don't really have any benchmark to know how much faster their direct line is vs. their ticker plant. If there is something out there faster than the standard Rithmic ticker plant, I'm all ears.
In the analytical world there is no such thing as art, there is only the science you know and the science you don't know. Characterizing the science you don't know as "art" is a fools game.
Are you using C++ or .NET? With .NET you have the GC to deal with. It's impossible to audit Rithmic's API code base without them sharing their source so you can really dig in. The problem could be in their networking stack, or possibly on their infrastructure side. The fact that they have a "Diamond" API option kind of points directly at them throttling or de-prioritizing you.
How far behind is your application? How many milliseconds? How many instruments are you listening to?
This is an example of what I see sometimes. This was on 10-07-22 at roughly 7:30 AM Chicago time. Each row represents an event that takes roughly 6 points to reach. For about 10 minutes my application's timestamp (in microseconds) lagged the exchange by such a significant margin that I was something like 10+ events behind. I don't think this is garbage collection or throttling per se; sometimes the market literally goes nuts, and using their public ticker plant isn't going to be fast enough. From what I understand, their Diamond program bypasses the public ticker plant, so you would be in a priority lane to receive data, but I have no clue whether it could handle something like this.