BRICKPICKER LEGO STOCK TRACKER **NOW WITH ALERTS**


Recommended Posts

Running this test overnight was honestly its first stress test, and we saw that we will need to raise these servers from 2 GB of memory to 6 GB. That is another $30/month for each server.
Because I am interested in this on a technical level: why isn't 2 GB enough? Assuming the servers only do the task of obtaining the information (polling the sellers), that should not need anywhere near that amount of memory, even if they cached the results for some time. What am I missing?

 


This is being run on the MS Azure cloud farm. It's compiled C# code, and it is just a little more memory hungry. When you are throwing approximately 100+ connections out every few seconds, it just needs it. This is how we did it; I am sure those of you who know Python or other stacks that run on Linux/nginx could come up with something more efficient, but I don't know that world. There are also layers of monitoring going onto the actual crawling that we are starting now, looking for signs of it blowing up, to alert us or kick off an automatic restart. I am not the fastest in the world at releasing things, but I try to think things out for the future and where I think it may go. I also don't mind spending a few extra $$ if it means that the overall experience is good.
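For anyone curious how that many concurrent connections drives memory up, here is a minimal, hypothetical sketch (in Python, which the post mentions as an alternative stack, not the tracker's actual C# code) of capping in-flight requests with a semaphore so 100+ simultaneous fetches don't each hold buffers at once. `fetch_price` and its return value are stand-ins for a real HTTP request:

```python
import asyncio

# Hypothetical sketch: cap in-flight requests so memory stays bounded,
# instead of letting 100+ connections each hold buffers at the same time.
MAX_IN_FLIGHT = 20

async def fetch_price(seller_id: int, sem: asyncio.Semaphore) -> tuple[int, float]:
    async with sem:                  # at most MAX_IN_FLIGHT run concurrently
        await asyncio.sleep(0.01)    # stand-in for the real HTTP request
        return seller_id, 19.99      # stand-in price

async def poll_all(n_sellers: int) -> dict[int, float]:
    sem = asyncio.Semaphore(MAX_IN_FLIGHT)
    tasks = [fetch_price(i, sem) for i in range(n_sellers)]
    return dict(await asyncio.gather(*tasks))

prices = asyncio.run(poll_all(100))
```

Lowering `MAX_IN_FLIGHT` trades peak memory for a longer poll cycle; the right value depends on how large each response body is.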


Something to consider: it sounds like you poll the target pages every second or two, but the page itself is on a 30-second refresh, so roughly 28 * 100 polls are wasted per person viewing. You could either slow polling down dramatically (to ~20 s per poll) or use Ajax to update the page and remove the global refresh. That would give people the new data instantly and not waste polls.
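The arithmetic behind that estimate can be spelled out. The numbers below (one poll per second, a 30-second page refresh, 100 tracked pages) are the assumptions implied by the post, not measured values:

```python
poll_interval_s = 1     # assumed backend poll rate (~every second)
page_refresh_s = 30     # the page only re-renders every 30 seconds
pages = 100             # assumed number of tracked pages

polls_per_refresh = page_refresh_s // poll_interval_s  # 30 polls per cycle
unused_per_refresh = polls_per_refresh - 1             # only the last poll is ever shown
wasted = unused_per_refresh * pages                    # roughly the "~28*100" figure
```

With a 2-second poll interval instead, the waste halves, which is where the "every second or two" wording makes the estimate land near 28 rather than 29.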



The 30-second page refresh has nothing to do with grabbing the data. The data is being fetched in the background and written to the database; the refresh just saves people from hitting reload/F5 themselves.
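A minimal sketch of that design, under stated assumptions (a single in-process background worker and an in-memory snapshot standing in for the real database): scraping runs on its own schedule, and serving the page only reads the newest snapshot, so the refresh interval never triggers a scrape.

```python
import threading
import time

# Hypothetical sketch of the described split: a background worker keeps the
# latest data current, while page loads just read the newest snapshot.
latest = {"price": None, "updated_at": None}
lock = threading.Lock()

def background_updater(stop: threading.Event) -> None:
    while not stop.is_set():
        price = 19.99                 # stand-in for the real scrape
        with lock:
            latest["price"] = price
            latest["updated_at"] = time.time()
        stop.wait(0.05)               # real code would wait seconds between scrapes

def render_page() -> str:
    with lock:                        # what the 30 s auto-refresh would serve
        return f"price={latest['price']}"

stop = threading.Event()
worker = threading.Thread(target=background_updater, args=(stop,), daemon=True)
worker.start()
time.sleep(0.2)                       # let the worker run a few cycles
page = render_page()
stop.set()
worker.join()
```

Swapping the page's full refresh for an Ajax call to an endpoint backed by `render_page` would deliver the same snapshot without re-rendering the whole page.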


Jeff, I'm not sure which pages you are scraping, but I found that the mobile versions of multiple sources had their prices updated sometimes a full minute before their web versions. I'm not sure if they are doing some back-end caching on the web versions or what's causing it. Just something to look into.



Thanks, I will take a look at that. I didn't even think to look at those.


This is being run on the MS Azure cloud farm. It's compiled C# code. It is just a little more memory hungry.

Interesting. Do you have an idea where all that memory actually goes? What does it need to store that is that large? I would have imagined the network is the bottleneck, both bandwidth and latency (and, with those, money), but not memory on the "slaves".

 

Anyway - great that it exists in the first place.


I agree that the "Higher than retail" boxes should be yellow, and the "lower than retail" should be blue. It's hard to see the exclamation points and tags at a glance, which is what we need to be able to do considering how many sets there are.

 

EDIT: I'd also like to see the Amazon column moved all the way to the right. The prices fluctuate, but there's always a unit available at Amazon, so it's not as useful if we're scanning the tracker.

 

EDIT #2: A few more ideas:

 

  • List them numerically in descending order. It'll make it easier for us to find the information we need, and it puts the exclusives closer to the top.
  • Alternate color or brightness row by row to make the info easier to read. I see that there's a slight variation between each row (white/light gray/white/light gray) but it's not enough to improve the eye's ability to read straight across.

Amazing tool, by the way. I really hope the guy who first coded a tracker gets credit and compensation.

Edited by johnwray
