I arrived here to add another vote for EVE Echoes

Posted By wang MMOruki, Jan 9

I arrived here to add another vote for adding lowest_sell and highest_buy to the CSV export. I'm working on my own tooling to pull your snapshot every 4 hours, and I'd really like to make just one API call for the full set of information rather than scraping every item I want highest_buy and lowest_sell for. Volume would be great too! I suppose what I'm really asking for is for your snapshot to offer a complete picture for a location (which right now is the average of all regions, but later snapshots could be region specific).
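For illustration, a minimal sketch of the "one call every 4 hours" workflow I mean, assuming the snapshot is served as a plain CSV; the URL and the "item" column name are placeholders, and highest_buy/lowest_sell are the columns requested above.

```python
# Minimal sketch of pulling the full snapshot in one request every 4 hours.
# SNAPSHOT_URL and the "item" column name are assumptions for illustration.
import time
from io import StringIO

import pandas as pd
import requests

SNAPSHOT_URL = "https://example.com/eve-echoes-market/stats.csv"  # placeholder

def pull_snapshot() -> pd.DataFrame:
    """Fetch the whole market snapshot in a single request."""
    resp = requests.get(SNAPSHOT_URL, timeout=60)
    resp.raise_for_status()
    return pd.read_csv(StringIO(resp.text))

if __name__ == "__main__":
    while True:
        snapshot = pull_snapshot()
        # With highest_buy / lowest_sell in the export, no per-item scraping is needed.
        print(snapshot[["item", "highest_buy", "lowest_sell"]].head())
        time.sleep(4 * 60 * 60)  # wait four hours before the next pull
```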

Hey there! The CSV should have the highest_buy and lowest_sell columns in it now. Hopefully this works for you. I'll probably add volume at some point, but I haven't done much validation that it's actually accurate, so I'm a little more hesitant there. My tooling isn't anything out of the ordinary. I've used a variety of EE data sources to pull together a few tools that help with my industrial work.
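A quick usage sketch against the updated export, assuming a local download of stats.csv; only highest_buy and lowest_sell are confirmed columns, the rest are guesses.

```python
# Quick check / usage sketch for the updated CSV.
# Only highest_buy and lowest_sell are confirmed; "item" is an assumed column.
import pandas as pd

df = pd.read_csv("stats.csv")

# Confirm the new columns made it into the export.
assert {"highest_buy", "lowest_sell"}.issubset(df.columns)

# Example: the spread between the best sell and best buy order per item.
df["spread"] = df["lowest_sell"] - df["highest_buy"]
print(df.sort_values("spread", ascending=False).head(10))
```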

I've really only used this to investigate potential home systems, but I suppose it could be used for some more advanced arbitrage analysis in the future if the market data can support per-ITC/region figures. Planetary Production data is merged with your market data (formerly entered manually) to find the best planets, systems, constellations, and regions for PP profits. The front end for this project is in Google Data Studio, as it has fairly decent built-in GUI querying tools.
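Roughly, the merge looks like the sketch below; the file names, column names, and the profit estimate are all illustrative assumptions rather than the actual yield and price maths.

```python
# Hedged sketch of joining Planetary Production data to market prices and
# ranking systems by estimated PP profit. File names, column names, and the
# richness * price estimate are illustrative assumptions.
import pandas as pd

pp = pd.read_csv("planetary_production.csv")   # planet, system, resource, richness
market = pd.read_csv("stats.csv")              # item, highest_buy, lowest_sell

merged = pp.merge(market, left_on="resource", right_on="item", how="left")
merged["est_profit"] = merged["richness"] * merged["highest_buy"]

# Rank systems; the same groupby works for constellations or regions.
best_systems = (
    merged.groupby("system")["est_profit"]
          .sum()
          .sort_values(ascending=False)
)
print(best_systems.head(10))
```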

The charts pull from the BigQuery copy of your market data. I'm not doing anything overly interesting with this data yet, but I think it could be interesting eventually to start looking at trends, early-week drops, weekend spikes, and so on. I get your data into BigQuery with a scheduled Cloud Function that runs every 4 hours, grabs stats.csv, and appends it to my BigQuery market table. Then I have tooling in Google Sheets that lets me pipe in the data along with parameters for date windows, specific items, and so on.
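The ingest step is roughly the sketch below: an HTTP-triggered Cloud Function, invoked by a scheduler every 4 hours, that downloads stats.csv and appends it to a BigQuery table. The URL, project, and table IDs are placeholders, not the real setup.

```python
# Rough sketch of the scheduled ingest described above. STATS_URL and TABLE_ID
# are placeholders; the real endpoint and dataset names are not shown here.
from io import StringIO

import pandas as pd
import requests
from google.cloud import bigquery

STATS_URL = "https://example.com/eve-echoes-market/stats.csv"  # placeholder
TABLE_ID = "my-project.eve_echoes.market"                      # placeholder

def ingest_market_snapshot(request):
    """HTTP entry point, triggered every 4 hours by a scheduler."""
    resp = requests.get(STATS_URL, timeout=60)
    resp.raise_for_status()

    df = pd.read_csv(StringIO(resp.text))
    df["snapshot_time"] = pd.Timestamp.utcnow()  # record when this pull happened

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_APPEND")
    client.load_table_from_dataframe(df, TABLE_ID, job_config=job_config).result()
    return "ok"
```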

There are a few reasons I don't, for now at least. One is that the raw data is significantly bigger, and even with the current data I'm already serving ~500 MB of requests every day, so I really don't want to pay a bunch more to cover the extra bandwidth. Another is that the format of the raw data has shifted over time as I've changed my process. I'm comfortable supporting the current API, but I don't want to feel locked into the current backend setup, or risk breaking the tools that others are building.
