
Scripting - Page Monitors, In-Stock Trackers, etc.


1 hour ago, xadrian said:

I had to uninstall the app because I was getting email, desktop pop up, and an alert on my phone about it.

Do you WUPHF?



I searched this thread and didn't seem to find any discussion on this...

I feel like there are bot programs snatching up sets as soon as they become available. A few programs became very popular for grabbing Nintendo Switches online, practically selling them out the instant they were restocked.

Has anyone experienced similar issues, had hunches, or gotten any insight into how this can be combated? Or, specifically, which bots are being employed?


I don’t think there are bots so much as there are other issues at play. 
Namely, Amazon has no limits on LEGO as far as I can tell. Folks are buying everything they can when something goes in stock. Target has relaxed limits as well.

Many restocks are only one or two at a time, so you are racing against others for the sole item. And by the time you know about it, it could have been available to others for a while.
Years ago I had an auto-purchase script that I built myself to purchase — quite unfortunately — pet shops at RRP when they were retiring. It worked fine, but eventually Amazon found a way to break it. I haven’t bothered with it since. I’ve also dabbled with scripts that would alert you to web page changes. If set up to just poll a handful of pages fairly often (every minute for example) you can catch most items but you have to be ready when the alert goes off. 
The big in stock tracker sites are monitoring about once every ten minutes. When you get an alert from them, it’s a rat race!

Lastly, I have been up working late more often the past few weeks. You can definitely get sets a lot easier at 3 AM than 3 PM, which tells me the auto-buy bots are not running, since bots don't sleep.
So, set up some page monitors, have your credit card info pre-programmed and Amazon Buy It Now set up, fire up your favorite game or queue up some movies, and see what you can score overnight! :)
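The "poll a handful of pages every minute" approach above is simple enough to sketch with nothing but the Python standard library. The URLs, the User-Agent string, and the print-based alert are all placeholders to adapt:

```python
import hashlib
import time
import urllib.request

def page_fingerprint(content):
    """Hash the page body so only a short fingerprint is stored."""
    return hashlib.sha256(content).hexdigest()

def check(seen, url, content):
    """Record the latest fingerprint; return True if the page changed."""
    digest = page_fingerprint(content)
    changed = url in seen and seen[url] != digest
    seen[url] = digest
    return changed

def fetch(url):
    # A browser-ish User-Agent; some sites refuse the default Python one.
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def monitor(urls, interval=60):
    """Poll each page once per interval and print an alert on any change."""
    seen = {}
    while True:
        for url in urls:
            if check(seen, url, fetch(url)):
                print("ALERT: %s changed" % url)
        time.sleep(interval)
```

Hashing the whole page means cosmetic changes (ads, timestamps) will also fire alerts; in practice you'd hash only the stock/price region of the page.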


What frustrates me is that my Target RedCard always gives me problems during checkout (even though I have enough $$$ for the debit card) and I've been cart-jacked. No more messing around with it on time-sensitive stuff.


I have a series of Selenium scripts running that check for inventory, and they run constantly over a VPN connection whose IP I shuffle pretty regularly. There is some other opsec that goes into it to avoid detection. I can tell you down to the minute last night when the Assembly Squares came up, because they would be gone almost instantly, so I know I can't be the only one. I've set some of them to auto-purchase and watch those the most for obvious reasons, but they are all set with a variable in the script that will not let them buy over a certain number, so it's a bit of a safeguard.
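Not the actual script, but a minimal Selenium sketch of the two pieces described: a headless driver and a hard purchase cap. The `add-to-cart-button` marker is an assumption about Amazon's markup:

```python
MAX_BUYS = 2  # hard cap; the script refuses to purchase past this number

def allow_purchase(bought_so_far, limit=MAX_BUYS):
    """The safeguard: never buy once the cap is reached."""
    return bought_so_far < limit

def make_driver():
    # Imported lazily so the safeguard above is testable without Selenium.
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options
    opts = Options()
    opts.add_argument("--headless")  # headless mode, as described above
    return webdriver.Chrome(options=opts)

def in_stock(driver, url):
    driver.get(url)
    # Assumed marker: the page shows this element id when an item is buyable.
    return "add-to-cart-button" in driver.page_source
```

Keeping the cap as a standalone function means the one piece of logic that can cost you real money is trivially testable on its own.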


I've been working on a similar strategy but found the amount of IP and VPN switching quite cumbersome. I went a different route and started putting together an application to track sales and pricing data on Amazon. It should be able to pull multiple data points on every LEGO set each day. I plan on using it to track set popularity over time, understand how frequently sets go out of stock, and monitor prices and available quantities (from other sellers) of sets I'm interested in selling.

I don't have a lot of experience with web crawlers, but once I finish the Amazon application I was going to work on something like you mentioned above.

I have a partially working email scraper that grabs all my receipts and puts them in a database to make inventory intake much easier. I have a lot to finish up, but I plan on automating some of the tedious portions of the job by the end of the year.
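A minimal sketch of a receipt scraper like the one described, using the standard library's imaplib and sqlite3. The sender filter and table layout are assumptions:

```python
import email
import imaplib
import sqlite3

def open_db(path="receipts.db"):
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS receipts"
        " (msg_id TEXT PRIMARY KEY, sender TEXT, subject TEXT, body TEXT)"
    )
    return con

def get_body(msg):
    """Return the plain-text body of a parsed email message."""
    if msg.is_multipart():
        parts = [p for p in msg.walk() if p.get_content_type() == "text/plain"]
        return parts[0].get_payload(decode=True).decode(errors="replace") if parts else ""
    return msg.get_payload(decode=True).decode(errors="replace")

def store(con, msg):
    # Message-ID as primary key makes re-running the scrape idempotent.
    con.execute(
        "INSERT OR IGNORE INTO receipts VALUES (?, ?, ?, ?)",
        (msg["Message-ID"], msg["From"], msg["Subject"], get_body(msg)),
    )
    con.commit()

def pull_receipts(host, user, password, search='(FROM "auto-confirm@amazon.com")'):
    # The sender filter is an assumption; adjust per retailer.
    con = open_db()
    imap = imaplib.IMAP4_SSL(host)
    imap.login(user, password)
    imap.select("INBOX")
    _, data = imap.search(None, search)
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        store(con, email.message_from_bytes(msg_data[0][1]))
    imap.logout()
```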


A service to monitor the ups and downs of prices for Amazon products? Reminds me of the humps in a camel.


Camel does a really poor job of monitoring third-party seller or non-Buy Box offerings. Additionally, it doesn't track sales rank (which is what I'm most interested in).


Just giving you a hard time... it’s definitely not a reseller-grade service.

Why are you concerned about IPs and VPNs if you are just tracking pricing and sales? I don't want anyone to catch me as a bot in there performing transactions (hence the opsec), and truth be told, I do all this in headless mode, so the longest delay I have anywhere is that damn VPN/IP switching. In and out of any listing in under three tenths of a second. Switching IPs... tens of seconds. Cursed technology.


Can I get one of those?


A bot, or an army of bots.
Small party would suffice.
Will trade for functioning scraper.


What are you using a scraper for?


Not @donbee, but I'd like to create a scraper to look at LEGO Ideas daily to populate a spreadsheet, so I can do a little analysis whenever a new set breaks the 6,000-supporter mark, achieves support, or is climbing the ladder quickly.


https://ideas.lego.com/search/global_search/ideas?support_value=6000&support_value=10000&idea_phase=idea_gathering_support&query=&sort=most_recent

A URL that gives you all Ideas projects with more than 6,000 supporters, sorted newest first.

I'm not sure what would count as rising quickly. What sort of analysis are you doing?



Can anyone recommend any decent tutorials for writing a Python script that monitors prices and emails me when they drop below a certain point? I've managed to write a script that monitors the price of one item using the Beautiful Soup module, but that obviously isn't that helpful when you consider that I'm interested in 20+ items across multiple sites. From what I can gather, Selenium seems to be a better package for this application?

All of my coding experience up to this point is using R to build models that predict species abundance and distribution so I'm very much a beginner with Python.
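A hedged sketch of the alerting half of such a script, standard library only. The per-site parsing (with Beautiful Soup or otherwise) is reduced here to a regex over whatever page snippet you extract, and `send_alert` assumes a local SMTP relay; hosted mail needs auth and STARTTLS instead:

```python
import re
import smtplib
from email.message import EmailMessage

def parse_price(text):
    """Pull the first $-amount out of a snippet of page text."""
    m = re.search(r"\$([\d,]+\.?\d*)", text)
    return float(m.group(1).replace(",", "")) if m else None

def should_alert(price, target):
    """Alert only when a price was found and is at or below the target."""
    return price is not None and price <= target

def send_alert(item, price, to_addr, smtp_host="localhost"):
    # Assumes an SMTP relay on localhost; adjust host/auth for real providers.
    msg = EmailMessage()
    msg["Subject"] = "Price drop: %s now $%.2f" % (item, price)
    msg["From"] = to_addr
    msg["To"] = to_addr
    msg.set_content("%s dropped to $%.2f" % (item, price))
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```

Scaling to 20+ items is then mostly a loop over (name, url, target) tuples with a per-site extraction function; Selenium only becomes necessary when the price is rendered by JavaScript rather than present in the raw HTML.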


You could conceivably do it in either language. The main appeal of Selenium for me was that it has everything in one package (with the possible exception of a database), so it was faster for me to deploy. The problem I think you are going to run into is the reporting function you are looking for. If you are checking 20+ items across multiple sites, your scope might be a little too broad.


That’s the easy part.

What I want to scrape, daily, is a few things: the name, URL, author, and supporter count for each project that has at least 6,000 supporters. Any time a project hits 6,000, it needs to be added to the list. Then compare the date a project was submitted to the number of votes it has received, so if it hits 1,000 supporters in, say, 5 days, add it to the "watch this one" list.

I'll have to check my notes, but I think the issue was scraping the search page rather than the individual project page. Figuring out exactly which element needed to be scraped was also causing me issues.

I never got to the point of setting anything up in Python, as I was brute-forcing it, trying to figure out the elements first, and never got past that stage.
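The rules in the post above (track at 6,000 supporters; watch anything gaining roughly 1,000 supporters in 5 days) can be sketched like this, with a CSV file standing in for the spreadsheet. The rate constant and column layout are assumptions:

```python
import csv
from datetime import date

TRACK_THRESHOLD = 6000   # projects at 6,000+ supporters go on the list
WATCH_RATE = 1000 / 5    # "1,000 supporters in ~5 days" = 200 per day

def velocity(supporters, submitted, today):
    """Average supporters gained per day since submission."""
    days = max((today - submitted).days, 1)
    return supporters / days

def classify(supporters, submitted, today):
    if supporters >= TRACK_THRESHOLD:
        return "track"
    if velocity(supporters, submitted, today) >= WATCH_RATE:
        return "watch this one"
    return "ignore"

def append_row(path, project):
    # One CSV row per project per day; open the file as a spreadsheet later.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            date.today().isoformat(),
            project["name"], project["url"],
            project["author"], project["supporters"],
        ])
```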


I'm just curious -- to what end? Are you looking for trends in ideas that get support? You could just as easily get a count of all projects over 6,000 votes on a daily basis and only do a scrape if that number changes. It would save you some cycles in theory, since it should not happen too often. Unless you just want to track all projects with over 6,000 votes, like you mentioned, for velocity. Hmm, interesting concept, but I'm still unclear on why you'd do that.
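The count-then-scrape optimization could look something like this; the `idea-card` class name is a guess at the page markup, not the real selector:

```python
import urllib.request

# The search URL shared earlier in the thread (6,000-10,000 supporters).
SEARCH_URL = (
    "https://ideas.lego.com/search/global_search/ideas"
    "?support_value=6000&support_value=10000"
    "&idea_phase=idea_gathering_support&query=&sort=most_recent"
)

def project_count(html):
    # Assumed marker: count result cards by a class name in the page source.
    return html.count('class="idea-card"')

def needs_full_scrape(last_count, current_count):
    """Only run the expensive per-project scrape when the count moves."""
    return last_count is None or current_count != last_count

def daily_check(last_count):
    with urllib.request.urlopen(SEARCH_URL, timeout=10) as resp:
        html = resp.read().decode(errors="replace")
    count = project_count(html)
    return count, needs_full_scrape(last_count, count)
```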


I'm a nerd that likes that kind of stuff?  lol

This is a good example of what I like to do with the data:

 


