In the crowded arena of online shopping, speed is of the essence if you want to keep up with your competitors’ prices: If you aren’t quick enough, your rival could undercut you. A new report claims that Walmart is trying its hardest to automatically track Amazon’s prices, but it’s not having much luck.
Reuters reports that in January, Walmart engineers in charge of monitoring rivals’ online prices found the technology they were using to scrape Amazon.com for data wasn’t working anymore.
Walmart and many other retailers use bots — automated software intended to look like an authentic web surfer to the site being scanned — to search competitors’ offerings. Some bots are programmed to throw off the scent of bot-detectors, moving the cursor around a web page in the way a human might, rather than speeding through the page like a machine that knows exactly where to click.
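To get a feel for what that kind of evasion looks like in practice, here is a minimal sketch of the idea — jittered cursor paths and human-scale pauses between actions. The helper names (`human_like_path`, `human_like_delay`) are hypothetical illustrations, not anything from Walmart's actual scraping stack:

```python
import random
import time

def human_like_path(start, end, steps=20):
    """Interpolate a cursor path from start to end with small random jitter,
    so movement wanders a little like a human's instead of jumping straight
    to the target the way a naive bot would."""
    (x0, y0), (x1, y1) = start, end
    path = []
    for i in range(steps + 1):
        t = i / steps
        # No jitter at the endpoints, so the path still starts and ends exactly
        # where it should.
        jx = random.uniform(-3, 3) if 0 < i < steps else 0
        jy = random.uniform(-3, 3) if 0 < i < steps else 0
        path.append((x0 + (x1 - x0) * t + jx, y0 + (y1 - y0) * t + jy))
    return path

def human_like_delay(min_s=0.5, max_s=2.0):
    """Pause a random, human-scale interval between page actions, rather than
    firing requests at machine speed."""
    time.sleep(random.uniform(min_s, max_s))
```

A real scraper would feed a path like this into a browser-automation tool; the point is simply that timing and motion are randomized to blur the statistical signature that bot-detectors look for.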
But Amazon blocked Walmart’s bots from creeping on its online prices by putting up a digital curtain to hide its listings from users of a specialized web browser that Walmart reportedly used in its hunt for prices, insiders told Reuters.
According to those people familiar with the matter, the Arkansas-based retailer’s @WalmartLabs unit couldn’t get around the blockade for weeks, which meant it had to get Amazon’s data from a secondary source.
Walmart declined to comment to Reuters on the January maneuver but said that the company improves its technology regularly and uses a variety of tools to monitor prices.
Amazon said it’s aware that competitors use bots, but denied any “campaign” to “keep them from creeping.”
“Nothing has changed recently in how we manage bots on our site,” a spokeswoman told Reuters, adding that “we prioritize humans over bots as needed.”
While some online sites use CAPTCHA to trip up bots, those safeguards can be annoying to humans who are just trying to shop around, prompting retailers like Amazon to come up with new solutions. For example, one Amazon patent application [PDF] describes an encryption technology that would require bots to solve a complicated algorithm before gaining access to its Web pages, while humans wouldn’t have to do a thing.
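One well-known way to make software "solve a complicated algorithm" before granting access is a hashcash-style proof-of-work: cheap for a server to verify, but costly for a bot to produce at scale. To be clear, this is a generic illustration of that category of defense, not the scheme in Amazon's patent application:

```python
import hashlib

def find_proof(challenge: str, difficulty: int = 4) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce) begins with
    `difficulty` zero hex digits. The search cost grows exponentially with
    difficulty, which is what makes mass scraping expensive."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify_proof(challenge: str, nonce: int, difficulty: int = 4) -> bool:
    """Server-side check: a single hash computation, no matter how long the
    client spent searching for the nonce."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The asymmetry is the whole trick: a human's browser could do a small amount of work invisibly in the background, while a bot hammering thousands of pages pays the cost thousands of times over.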