Replies

olddirtypipster
12 Aug 2015, 17:17

RE:

Spotware said:

Dear Trader,

Thank you for your detailed feedback. 
Asynchronous event handling is too difficult for most of our users, so we do not plan to change the current message-loop approach. We plan to skip old market depth changed events if a new event is already in the queue. We already did this for the OnTick event. We also plan to update market depth in the RefreshData method.

If you would like to work with cServer without cAlgo, please have a look at Connect API. Please note that it doesn't provide market depth yet.

Thank you for your reply.

When you say "We plan to skip old market depth changed events if a new event is already in the queue", are you implying that you plan to implement a fixed circular buffer that purges old events not yet taken from the buffer, to leave space for newer ones? If that is the case, the result would be gaps in the MarketData stream on the user's end. If I have understood you correctly, then I strongly advise against this.

The aim here is to have COMPLETE MarketDepth, not MarketDepth with gaps as a workaround for un-optimized code. As you know, a good trading platform must adhere to the following:

  • real-time reliability (no missing data, and reflects the current market)
  • adaptability (can adapt to varied trading conditions; slow/fast data in this case)
  • extensibility, scalability and modularity (allows users to interface with their financial data in a way that adheres to the first two conditions)

If you discard (skip) old market depth events in favor of publishing newer ones, the end result would be gaps in the market data during peak times. This would violate the first and second principles, and everything goes from bad to worse. This is not a good solution.

My suggestion is that you stream the data continuously, and give the user every opportunity to take it all in using optimized code methodologies. As it stands, cAlgo is throttling back.
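To make the gap concrete, here is a minimal Python sketch (illustrative only, not cAlgo's actual implementation) of the kind of "keep only the newest event" conflation proposed above. Every intermediate depth update that a slow consumer fails to pick up is simply gone:

```python
class ConflatingQueue:
    """Keeps only the most recent event; older unconsumed events are dropped."""
    def __init__(self):
        self.latest = None
        self.dropped = 0

    def publish(self, event):
        if self.latest is not None:
            self.dropped += 1  # an unconsumed depth snapshot is lost forever
        self.latest = event

    def consume(self):
        event, self.latest = self.latest, None
        return event

# Producer emits 10 depth updates; the slow consumer only polls once.
q = ConflatingQueue()
for seq in range(10):
    q.publish({"seq": seq})

snapshot = q.consume()
print(snapshot["seq"])  # 9 -- only the newest update survives
print(q.dropped)        # 9 -- every intermediate update is a gap in the stream
```

The consumer stays "current", but at the cost of nine missing snapshots, which is exactly the violation of real-time reliability described above.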

 

You are correct in saying this: "Asynchronous event handling is too difficult for most of our users". WHAT USERS MUST BE AWARE OF, however, is that if their operation on each piece of data takes more than a few milliseconds, they WILL suffer degrading performance due to increasing lag. This is inevitable given cAlgo's current design. Eliminating data to remove the backlog does not solve the problem; it makes the problem worse.
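The arithmetic behind this inevitability can be sketched with hypothetical numbers (events arriving every 1 ms, a handler taking 3 ms per event; these figures are illustrative, not measured from cAlgo):

```python
# Queue backlog when per-event handler time exceeds the inter-arrival time.
# All numbers below are hypothetical, chosen only to show the dynamics.
arrival_interval_ms = 1.0
handler_time_ms = 3.0
duration_ms = 10_000  # ten seconds of peak market activity

events_in = duration_ms / arrival_interval_ms   # 10,000 events arrive
events_handled = duration_ms / handler_time_ms  # only ~3,333 get processed
backlog = events_in - events_handled            # ~6,667 events still queued
lag_ms = backlog * handler_time_ms              # time needed to drain them

print(int(backlog))           # 6666
print(round(lag_ms / 1000, 1))  # 20.0 -- seconds of lag after only 10 s
```

The backlog, and therefore the staleness of the data the user sees, grows linearly for as long as the arrival rate exceeds the handling rate; no amount of queue management changes that, only faster handling or parallelism does.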

The Achilles heel of the Connect API is precisely that it does not support Market Depth. Much of the important information I require is hidden in the statistics buried inside the MarketDepth.

Until you are able to provide a MarketDepth data feed from Connect API, it is of no use to me.


@olddirtypipster

olddirtypipster
12 Aug 2015, 14:28

RE: RE:

Lastly, you could provide the user with a direct connection to the cServer MarketData stream, so that they may bypass the inefficiencies of cAlgo when acquiring market data. I am certain that this was a suggestion offered via a phone conversation before.

Cheers.

olddirtypipster said:

I appreciate your response.

In that case, my assessment of non-optimized coding for high-speed streams is correct.

Because cAlgo and cTrader (I tested this as well) push incoming events from the processing queue to the user via MarketDepthOnUpdated one by one (sequentially), if the rate of incoming data exceeds the rate at which the user pulls data in (which will be the case for low-latency, HFT-targeting liquidity providers during hyper market activity), then the processing queue becomes more and more backlogged, and the user suffers by receiving market data whose relevance to real time degrades over time.

Previously, I was placing the data as it came in into a BlockingCollection<MarketData>, and then using a pipelined sequential foreach loop to pull the data out one by one.

What I discovered was that my BlockingCollection buffer became backlogged very rapidly, as the single foreach loop could not pull and process fast enough to keep up with the incoming data. The problem becomes even more severe if one performs any heavy analysis on each data packet before iterating the loop to fetch the next one. Very soon you find yourself operating on data several minutes in the past, unable to keep up.

There are possible solutions, however:

On the user end, a parallelized for loop will mitigate the degrading performance.

On the cAlgo developer side, I suggest employing a parallelized for loop that dumps several market data orders into its internal processing queue for pickup by the user.

Secondly, instead of employing conventional .NET event messaging, I think it would be best to use the .NET Reactive Extensions: convert your MarketDataEvent message into an Observable stream and continuously stream it to each subscribed user.

Spotware said:

Dear Trader,

cAlgo invokes all of a cBot's handlers in a single thread. All incoming events are added to the processing queue, and the cBot handles events one by one. If the cBot takes too much time to handle an event, the queue will accumulate more and more events, which causes handling of obsolete events. Please try to optimize the code of your cBot, especially the handler of the market depth changed event. If that doesn't help, we recommend moving your calculations to a separate thread to minimize processing time in the main thread.

 

 


@olddirtypipster

olddirtypipster
12 Aug 2015, 14:23

RE:

I appreciate your response.

In that case, my assessment of non-optimized coding for high-speed streams is correct.

Because cAlgo and cTrader (I tested this as well) push incoming events from the processing queue to the user via MarketDepthOnUpdated one by one (sequentially), if the rate of incoming data exceeds the rate at which the user pulls data in (which will be the case for low-latency, HFT-targeting liquidity providers during hyper market activity), then the processing queue becomes more and more backlogged, and the user suffers by receiving market data whose relevance to real time degrades over time.

Previously, I was placing the data as it came in into a BlockingCollection<MarketData>, and then using a pipelined sequential foreach loop to pull the data out one by one.

What I discovered was that my BlockingCollection buffer became backlogged very rapidly, as the single foreach loop could not pull and process fast enough to keep up with the incoming data. The problem becomes even more severe if one performs any heavy analysis on each data packet before iterating the loop to fetch the next one. Very soon you find yourself operating on data several minutes in the past, unable to keep up.
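The backlog behaviour described above can be reproduced with a small Python sketch, using queue.Queue as a stand-in for .NET's BlockingCollection<MarketData> (names, counts and timings here are hypothetical):

```python
import queue
import threading
import time

# queue.Queue plays the role of BlockingCollection<MarketData>.
events = queue.Queue()

def producer(n):
    """Market data arrives far faster than it is consumed."""
    for seq in range(n):
        events.put(seq)

def slow_consumer(n, handled):
    """Sequential foreach-style consumer with heavy per-event analysis."""
    for _ in range(n):
        seq = events.get()
        time.sleep(0.001)  # simulate analysis taking ~1 ms per event
        handled.append(seq)

handled = []
t = threading.Thread(target=slow_consumer, args=(5, handled))
t.start()
producer(100)        # 100 events arrive almost instantly
t.join()

print(handled)        # [0, 1, 2, 3, 4] -- consumer only got through five
print(events.qsize()) # 95 -- the rest are already backlogged
```

By the time the consumer reaches the queued items, they describe a market state that is increasingly far in the past, which is exactly the degradation described above.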

There are possible solutions, however:

On the user end, a parallelized for loop will mitigate the degrading performance.

On the cAlgo developer side, I suggest employing a parallelized for loop that dumps several market data orders into its internal processing queue for pickup by the user.

Secondly, instead of employing conventional .NET event messaging, I think it would be best to use the .NET Reactive Extensions: convert your MarketDataEvent message into an Observable stream and continuously stream it to each subscribed user.
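For illustration, here is a toy Python Observable sketching the push model being suggested; the real Rx.NET IObservable&lt;T&gt; contract is of course richer (OnError/OnCompleted, schedulers, backpressure operators, and so on):

```python
# A toy push-based Observable, sketching the Rx idea of streaming
# MarketDataEvent messages to subscribers instead of having them
# poll a processing queue. Illustrative only -- not the Rx.NET API.
class Observable:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, on_next):
        """Register a callback that receives every future event."""
        self._subscribers.append(on_next)

    def on_next(self, event):
        """Push an event to all subscribers immediately."""
        for subscriber in self._subscribers:
            subscriber(event)

depth_stream = Observable()
received = []
depth_stream.subscribe(received.append)

for seq in range(3):
    depth_stream.on_next({"seq": seq})  # pushed to the user, not pulled

print([e["seq"] for e in received])     # [0, 1, 2]
```

The design point is that delivery becomes the platform's responsibility: events flow to subscribers the moment they exist, and Rx operators (sampling, buffering, observing on another scheduler) give the user explicit, visible control over what happens under load, instead of a hidden queue silently growing stale.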

Spotware said:

Dear Trader,

cAlgo invokes all of a cBot's handlers in a single thread. All incoming events are added to the processing queue, and the cBot handles events one by one. If the cBot takes too much time to handle an event, the queue will accumulate more and more events, which causes handling of obsolete events. Please try to optimize the code of your cBot, especially the handler of the market depth changed event. If that doesn't help, we recommend moving your calculations to a separate thread to minimize processing time in the main thread.

 


@olddirtypipster

olddirtypipster
01 Aug 2015, 02:16 ( Updated at: 21 Dec 2023, 09:20 )

RE:

How far into the future must we wait before this feature of non-aggregated depth is implemented?

Spotware said:

Dear Trader,

The VWAP DoM displays a list of expected VWAP prices next to a list of adjustable volumes. To see the standard DoM with an overview of available liquidity for a particular currency pair please click the button shown in the attached screenshot.

Regarding the non-aggregated market depth: currently we do not plan to provide this functionality, but we will consider providing it in the future.

 


@olddirtypipster

olddirtypipster
19 Jul 2015, 20:43

RE: Non-Aggregated Depth of Market

MerlinBrasil said:

Dear Spotware,

I'm also EXTREMELY INTERESTED in having this functionality.

Thanks for your persistence in the face of very mixed messages, ODPipster, and thanks for that link.

An actual real-time view of the LPs' book is critical, and since you even offered to code it for them, I cannot understand why Spotware has been so adamant in their refusal to consider it.

Not having RT non-aggregated DOM is much like trading blindfolded. Having that data puts Retail Traders in a position only the Big Dogs currently enjoy.

PLEASE reconsider your position. ODPipster has been unnaturally patient. I can't imagine why an honest broker (ok, no laughing.....) would have any problem demonstrating this kind of transparency, and transparency is what Spotware says they're all about, yes?

Thanks,

Merlin (SkypeID: merlinbrasil)
 

Quite frankly, the only reason a provider would refuse to make a non-aggregated market feed available is if the level II price stream is faked.

It would be very easy to identify an illegitimate (synthesized) non-aggregated price feed by comparing the individual LP prices over a given range to those produced by another source.

For instance, if the broker purports that two of their LPs are Barclays and BNP, then you'd simply archive a snippet of these prices for a specific time, then call up those two banks and ask for their prices over the same period.

If the prices match, then you'd know that the DOM is legit. If they do not match up, then you'd know that the DOM is spewing garbage data with zero correlation to the real market.
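A sketch of that verification in Python, with made-up prices keyed by timestamp (all names and values hypothetical):

```python
# Hypothetical check: compare an archived snippet of the broker's DOM feed
# against reference prices obtained from the LP for the same timestamps.
broker_feed = {1000: 1.09231, 1001: 1.09234, 1002: 1.09230}
lp_reference = {1000: 1.09231, 1001: 1.09234, 1002: 1.09230}

def feeds_match(a, b, tolerance=1e-6):
    """True if both feeds agree (within tolerance) at every shared timestamp."""
    common = set(a) & set(b)
    return all(abs(a[t] - b[t]) <= tolerance for t in common)

print(feeds_match(broker_feed, lp_reference))      # True  -> DOM looks legit
print(feeds_match(broker_feed, {1000: 1.20000}))   # False -> synthesized data
```

In practice one would compare thousands of samples and allow for clock skew between the two archives, but the principle is the same: with non-aggregated prices the comparison is possible at all; with aggregated prices it is not.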

Because the individual LP prices are aggregated, there is no way of comparing them, and a broker could be hiding their dishonesty.

For example, it gives a broker the wiggle room to engineer internal spikes in their pricing feed that do not necessarily occur in the direct market. It would be impossible for the average user to verify these spikes. There are several other ways a broker could get away with internalized price manipulation by hiding behind the fact that they stream aggregated price feeds.


@olddirtypipster

olddirtypipster
13 Jul 2015, 20:38

In addition, we are still waiting to see non-aggregated prices on the DOM. When should we be seeing this very important feature?


@olddirtypipster

olddirtypipster
13 Jul 2015, 20:37

Why does this view not report sizes smaller than one lot? Surely, there should be several trades being made in this category.


@olddirtypipster

olddirtypipster
04 Jul 2015, 08:38

I am building one of these now. It records depth of market, and plays it back at the same rate as the market.
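Rate-faithful playback of the kind described can be sketched as follows: replay recorded depth events with their original inter-arrival delays, optionally scaled (the recording data below is made up for illustration):

```python
import time

# Recorded depth snapshots as (timestamp_seconds, snapshot) pairs.
# Hypothetical data -- a real recording would hold full DOM state.
recording = [(0.00, "snap0"), (0.05, "snap1"), (0.15, "snap2")]

def replay(events, speed=1.0):
    """Yield snapshots in order, sleeping the original gap between them
    (divided by `speed`, so speed=2.0 replays twice as fast)."""
    previous_ts = events[0][0]
    out = []
    for ts, snapshot in events:
        time.sleep((ts - previous_ts) / speed)  # wait as long as the market did
        previous_ts = ts
        out.append(snapshot)
    return out

print(replay(recording, speed=10.0))  # ['snap0', 'snap1', 'snap2']
```

A speed of 1.0 reproduces market pace exactly, which is what makes such a recorder useful for realistic strategy testing.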

Contact me so we can discuss pricing at olddirtypipster@gmail.com


@olddirtypipster

olddirtypipster
04 Jul 2015, 08:37

DDE is obsolete. I am assuming that you want a feature that streams prices and depth of market to an Excel sheet or database by connecting to the main cTrader app, in a way similar to MT4.

I can do this for you. Write me at olddirtypipster@gmail.com so we can talk about a reasonable price.


@olddirtypipster

olddirtypipster
30 Jun 2015, 20:22

RE:

Spotware said:

Dear Trader,

Since you asked again for access to non-aggregated Depth of Market, we will consider it.

This gives me a glimmer of hope. Here's hoping to see it in the next release!


@olddirtypipster

olddirtypipster
29 Jun 2015, 23:58

RE:

Spotware said:

Dear Trader,

Thank you for your suggestion. We will consider it. Additionally you can post your ideas/suggestions to http://vote.spotware.com/

I keep getting mixed messages. On one hand, one of you says you refuse to do this. Then, on the other hand, someone else says you will consider it.

I would like to see this feature implemented in the next release. I have been speaking about this for over a year now.


@olddirtypipster

olddirtypipster
26 Jun 2015, 23:08

RE:

Spotware said:

I am inclined to suspect that the reason why you adamantly do not want to add this most basic of features is that, other than the top-of-book price, the other price/volumes in the DOM view are algorithmically synthesized by the cTrader engine and are not in fact being streamed from LPs.

We do not generate the DoM. The DoM is taken from LPs directly.

I want to be able to differentiate between the LP prices displayed.

I understand that you cannot divulge the identities of some/all of your LPs, but what you can do is provide a means to view all prices provided by the LPs instead of automatically aggregating them.


@olddirtypipster

olddirtypipster
05 Apr 2015, 20:41

very suspicious...


@olddirtypipster

olddirtypipster
02 Apr 2015, 20:46

This is what your platform should have to make it truly competitive.

http://www.youtube.com/watch?v=Kf-3FSe8-rY&t=6m31s

You are making a mistake by not considering it.


@olddirtypipster

olddirtypipster
02 Apr 2015, 18:44

I am inclined to suspect that the reason why you adamantly do not want to add this most basic of features is that, other than the top-of-book price, the other price/volumes in the DOM view are algorithmically synthesized by the cTrader engine and are not in fact being streamed from LPs. Other than the top-of-book price, these prices are meaningless noise.


@olddirtypipster

olddirtypipster
02 Apr 2015, 18:33

I figured as much.

 

Well... I am a skilled programmer, and can do this for you. Would you consider?


@olddirtypipster

olddirtypipster
01 Apr 2015, 12:45

RE:

comment was removed by moderator


@olddirtypipster

olddirtypipster
22 Feb 2015, 19:56

I suspect that one or more of these brokers are being dishonest (as usual).


@olddirtypipster

olddirtypipster
22 Feb 2015, 19:49

RE:

Yes. Tools such as this do exist, but they are extremely expensive. Sorry.

usynn said:

Hi there, is it possible for us to be able to backtest/run a simulator which runs through tick data and simulates historical data as if it were happening in real time. There are many such simulators for MT4 where you can speed up and slow down the simulator. I usually use MT4 for simulator trading but I've created my own custom indicator with cAlgo and would like to sim trade in cTrader with historical data. Also, the ability to use tick-data, certain points or just every bar would be ideal.

If this concept doesn't exist within cTrader, would it be possible to be put down on the to-do's for the devs?

btw, when I say sim trade I don't mean running a cBot, but visually running through historical data as if it were real-time, only faster.

Many thanks

 


@olddirtypipster

olddirtypipster
09 Jan 2015, 20:36

testy...


@olddirtypipster