DTN_Steve_S has contributed to 2094 posts out of 19263 total posts
(10.87%) in 5,053 days (0.41 posts per day).
20 Most recent posts:
This isn't something we can do immediately because it would potentially break other software that uses IQFeed on the same machine.
One of the primary features of IQFeed is that you are able to run multiple 3rd party trading apps side by side without them interfering with each other. This is largely only possible due to the per-connection based settings.
With that said, this is a fairly common request, especially from non-commercial software developers (who are frequently not using multiple apps). Unfortunately there won't be any features like this in 6.1 but we do have some ideas to make setting feed configuration easier in future versions.
The only recommendation I could give would be to first request daily (or weekly/monthly) data to find the start of data for the symbol. At that point you could craft an HIT request with an end datetime that reduces the overall amount of data you're having to process.
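As a rough sketch of that two-step approach in Python: the HIT field order below matches the example requests later in this thread, but the HDT (daily) field order and the helper names are my own assumptions and should be verified against the History lookup docs.

```python
# Hedged sketch of the two-step approach above. The HIT field order matches
# the example requests in this thread; the HDT field order is an assumption
# to check against the historical lookup documentation.

def build_daily_request(symbol, request_id="FINDSTART"):
    # Assumed layout: HDT,[Symbol],[BeginDate],[EndDate],[MaxDatapoints],
    #                 [DataDirection],[RequestID]
    # Empty begin/end with DataDirection=1 -> full daily history, oldest
    # first, so the first row returned gives the symbol's start of data.
    return f"HDT,{symbol},,,,1,{request_id}\r\n"

def build_bounded_hit(symbol, interval_secs, start_of_data, end_datetime):
    # HIT,[Symbol],[Interval],[Begin],[End],[MaxDatapoints],
    #     [BeginFilterTime],[EndFilterTime],[DataDirection]
    # Bounding Begin at the known start of data keeps the request from
    # covering years that hold no data for this symbol.
    return f"HIT,{symbol},{interval_secs},{start_of_data},{end_datetime},,,,1\r\n"
```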
Also, as for making the documentation clearer, the next version of the IQFeed documentation describes the dataDirection field as follows:
- This determines which order the data is returned to you.
- '0' (newest to oldest) or '1' (oldest to newest).
- Note, this does NOT change the data that is returned. The ordering of the data is simply reversed if oldest to newest is selected.
I'll look into something to address explaining a bit more about how the result sets are collected.
Confirming what Yair said here.
The [MaxDatapoints] should be thought of as a "No more than X datapoints" instead of "First X datapoints before/after..." from the API.
Also, the [DataDirection] parameter does not change the result set. It only changes the direction in which the result set is delivered.
As a result, the request HIT,AMD,60,20000101 000000,20190101 235959,3,,,0 says "give me all of the data from 20190101 235959 back to 20000101 000000, but no more than 3 datapoints."
Additionally, HIT,AMD,60,20000101 000000,20190101 235959,3,,,1 says "give me the same 3 datapoints, but deliver them to me oldest to newest."
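To make those semantics concrete, here is a toy Python model of the behavior described above. This is an illustration only, not actual server code: MaxDatapoints caps the result set (anchored at the end of the requested range), and DataDirection only changes the delivery order of that same result set.

```python
# Toy model: MaxDatapoints selects the same bars regardless of direction;
# DataDirection only reverses the order in which they are delivered.

def serve_history(bars, max_datapoints, direction):
    """bars: full history, oldest -> newest, as the server stores it."""
    selected = bars[-max_datapoints:] if max_datapoints else bars
    return selected[::-1] if direction == 0 else list(selected)

bars = ["d1", "d2", "d3", "d4", "d5"]
# Direction 0: the 3 newest bars, delivered newest first.
assert serve_history(bars, 3, 0) == ["d5", "d4", "d3"]
# Direction 1: the SAME 3 bars, just delivered oldest first.
assert serve_history(bars, 3, 1) == ["d3", "d4", "d5"]
```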
My apologies for the delayed response here.
You've decoded the message correctly.
It would appear that this is a bug in the feed but I don't know how or why it occurred at this time. If you have more examples, it will help in tracking this down, but I don't believe this is a case of simply applying the valid time to the wrong field.
My apologies for the typo (I'm sure it's been that way for years). It should say "IQFeed Developer Support". I have fixed it.
Here is the direct link: http://forums.dtn.com/index.cfm?page=forum&forumID=9
Edited by DTN_Steve_S on Apr 25, 2019 at 08:50 AM
Backfill will be obtained using the requests in the HistorySocket examples. From that point, the data is kept up to date using the Level1Socket.
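A minimal sketch of that backfill-then-live pattern: the port numbers are the defaults of a standard IQConnect install, and the HID field layout is my reading of the HistorySocket docs, so verify both before relying on them.

```python
# Backfill-then-live sketch: pull history from the lookup socket, then keep
# it current with a Level 1 watch. Ports are IQConnect defaults (assumed);
# the HID field layout should be checked against the HistorySocket docs.
LOOKUP_PORT = 9100   # historical lookup socket
LEVEL1_PORT = 5009   # streaming Level 1 socket

def backfill_request(symbol, interval_secs=60, days=5):
    # Assumed layout: HID,[Symbol],[Interval],[Days] -> last N days of bars
    return f"HID,{symbol},{interval_secs},{days}\r\n"

def watch_command(symbol):
    # 'w' on the Level 1 socket starts streaming updates for the symbol
    return f"w{symbol}\r\n"
```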
With no additional information, I'm not sure why the L1socket wouldn't have returned data for you. At the very least you should have gotten an initial snapshot of data. What symbol were you using? Did you get a not found message back instead?
Denis, the file I updated yesterday has the following header as the first line.
Is that consistent with what you are seeing? (As I mentioned, this isn't quite a finished offering for us in terms of delivery, so it's possible there are still issues with the implementation.)
By the way, I also found an older thread on the same subject, which explains that the Expiration Date field is currently intentionally left blank since the same data is embedded in the symbol. I'm not sure whether that thread contains extra information that is useful to you, but I wanted to make you aware of it just in case.
We do not retain these lookup lists historically. The best we can offer is a list of all expired equity options that have traded in the system so you can build these lists on your end. We currently only update this as a manual process every couple months (on demand when customers ask). I have updated it for this morning's file.
The file is located on our FTP server here:
Edited by DTN_Steve_S on Apr 9, 2019 at 06:49 AM
This does not exist in the API.
Sorry I'm late to this thread.
I'm looking into this but I really don't see anything yet that would indicate this is anything other than general internet congestion causing problems. This could explain why restarting (and potentially getting assigned a different server farm) is having an effect.
Can you guys enable Request and Error logging levels and make sure you save off the log before restarting the feed (it overwrites on startup) next time you see this behavior? Feel free to post logs here in this thread or email them to dev support. I'd like to start tracking this to see if some sort of pattern becomes evident.
OK, another update. After searching through the server logs a bit, it appears that this error has been re-implemented for obviously bad symbols (stuff not allowed by our symbology).
However, with that said, there were only 13 requests in total on March 14th that returned this error, and none of them were for valid symbols.
I just realized something that's not quite right here.
A long while back (at least a year or two), the history servers stopped sending the Invalid Symbol error code so there is no way you should have been getting that specific error. Are you certain this is the case? If so, please let me know your loginID so I can look at the server logs. Feel free to send it to me in a PM or an email to dev support.
No, the history servers were updated quite a while ago (at least a couple weeks before that). I'll get the release notes updated to indicate that has been completed.
I don't see anything that would indicate any of these issues are results of an error in the API (not 100% sure if that's what you were indicating or not but figured I'd mention it).
At this point, I would recommend turning on the logging in IQConnect to show you the Lookup Requests and Data for troubleshooting. You can do this via the example app.
I'm fairly certain you have something mixed up in your request/response processing code and hopefully the logfile will show you exactly what the feed is receiving from your app and sending in response.
The text vs. XML issue has to be an error where you're either not actually sending the 't' or sending the 't' in the wrong field.
Hello. There are a couple of things to consider here. First, your S,SET PROTOCOL command isn't correct; it only uses Major/Minor, and you should be getting an error message back on that request. Second, the results you are getting are for a config request. You have to be sending that request somewhere in your code (nothing in the API itself does this automatically).
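For reference, a minimal sketch of a correct protocol command, using only the major and minor version as described above (the 6.1 value is taken from this thread):

```python
def set_protocol_command(major, minor):
    # S,SET PROTOCOL takes Major.Minor only -- no build/revision digits.
    return f"S,SET PROTOCOL,{major}.{minor}\r\n"

assert set_protocol_command(6, 1) == "S,SET PROTOCOL,6.1\r\n"
```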
Other than that, your request works fine for me and returns headlines (only a few results copied here for brevity):
N,RTK,22120099247,:DKK:,20190311092941,Denmark Inflation Slows In February
N,RTK,22120096642,:EUR:,20190311091801,German BDI Cuts 2019 Growth Forecast To 1.2% From 1.5%
N,RTK,22120096422,:NOK:,20190311091351,Norway Inflation At 7-Month Low
N,RTK,22120093159,:EUR:,20190311085320,Estonia Trade Gap Narrows In January
N,RTK,22120087392,:USD:,20190311083820,U.S. Retail Sales Unexpectedly Rise 0.2% In January
N,RTK,22120086279,:USD:,20190311083100,U.S. Retail Sales Rise 0.2% In January, Ex-Auto Sales Climb...
N,RTK,22120062151,:EUR:,20190311065957,Ireland Construction Growth At 7-Month High
N,RTK,22120056465,:EUR:,20190311062747,Malta Jan Industrial Production +2.3% On Month Vs. -7.0% In...
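The headline rows above are comma-separated, and a hedged parser might look like the following. The field names are my own labels for what the rows appear to contain (message type, distributor, story ID, colon-delimited symbol list, timestamp, headline); verify them against the news lookup docs.

```python
# Hedged parser for the N,... headline rows shown above. Field labels are
# assumptions based on the visible row layout, not the official names.

def parse_headline(line):
    msg_type, distributor, story_id, symbols, ts, *rest = line.split(",")
    return {
        "distributor": distributor,
        "story_id": story_id,
        "symbols": [s for s in symbols.split(":") if s],
        "timestamp": ts,                 # appears to be YYYYMMDDHHMMSS
        "headline": ",".join(rest),      # headline text may itself contain commas
    }

row = "N,RTK,22120099247,:DKK:,20190311092941,Denmark Inflation Slows In February"
h = parse_headline(row)
assert h["symbols"] == ["DKK"]
assert h["headline"] == "Denmark Inflation Slows In February"
```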
Hello, concerning the trades in question: these are valid trades, so it will be up to you to filter them within your app. However, we provide some help on filtering by sending the trade conditions for each trade along with a classification of last-qualified or non-last-qualified. In your example, all of these trades are non-last-qualified ODDLOT trades. Additionally, 4 of the 5 are labeled as CASH trades, with the 5th labeled as TTEXEMPT.
You will have to do some experimentation to figure out exactly which types of trades you want to exclude from your analysis.
I am still looking into the volume discrepancy however, generally speaking, summation of tick volumes and comparing to total volumes is not a valid data verification technique because frequently these will not match up due to corrections, inserts, deletes, etc by the exchange.
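A hedged sketch of that kind of filtering: assuming your app has already decoded each trade's condition codes into readable names, you can drop trades tagged with conditions you've chosen to exclude. The exclusion set below just uses the condition names mentioned in this post; which conditions to exclude is your call.

```python
# Toy filter in the spirit of the advice above. Assumes trades are already
# decoded to dicts with a set of condition names; the EXCLUDE set uses the
# condition names from this post as an example, not a recommendation.
EXCLUDE = {"ODDLOT", "CASH", "TTEXEMPT"}

def filter_trades(trades, exclude=EXCLUDE):
    # Keep only trades with no excluded condition attached.
    return [t for t in trades if not (t["conditions"] & exclude)]

trades = [
    {"price": 10.00, "size": 50,  "conditions": {"ODDLOT", "CASH"}},
    {"price": 10.10, "size": 100, "conditions": set()},
]
assert filter_trades(trades) == [trades[1]]
```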
Jeff, I believe this is a different issue.
The issues referred to in this thread have been resolved.
I took a look at your requests on the servers, and I see no requests today that weren't serviced in under a second. In fact, the longest took 0.07s.
This implies that any requests you have that aren't getting completed are never making it to our servers which is an entirely different problem.
Can you send me a complete log file (likely will need to be zipped) that demonstrates the issue to developer support?
We continue to actively monitor this thread and the information you are providing is being utilized by our network, development and operations teams. aQuant, regarding your comment “I would like to see some involvement of IQFeed support in this… if they really want to solve it” – the answer is absolutely yes. We are here to support our customers and continually improve our products. Feedback from each of you is critical to our continued success.
There is a lot of action going on behind the scenes at DTN to further optimize our service and increase resiliency. This involves not only our staff, but teams in the middle and all the way through to the exchanges. In fact, this morning we received word that the CME is performing maintenance on fiber connections that we believe could be responsible for the sporadic dropped packets that may result in the reported stuck bids. Please continue to share your findings by sending them directly to developer support, and we will continue to work with you to resolve any issues you are experiencing.
The KBQueued value in the diagnostics app is just a display frontend for the clientstats messages on the Admin port from IQFeed via the API.
Turning this on/off and message format is documented here http://www.iqfeed.net/dev/api/docs/AdminviaTCPIP.cfm
Not sure if you were aware of that based on your post so wanted to make sure.
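As a minimal sketch of that toggle (the admin port number is the default of a standard IQConnect install, so verify it against your configuration):

```python
ADMIN_PORT = 9300  # default IQConnect admin port (assumed; check your install)

def client_stats_command(enable=True):
    # Toggles the per-client statistics messages on the admin connection;
    # those rows carry the figure the diagnostics app shows as KBQueued.
    return "S,CLIENTSTATS ON\r\n" if enable else "S,CLIENTSTATS OFF\r\n"
```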
Edited by DTN_Steve_S on Feb 19, 2019 at 08:47 AM