AfterShip charged us 26K USD and refused to refund
One day we were faced with a staggering bill from AfterShip, a company that provides an API for tracking shipment status.
Stories from ScrapeNinja founder: bootstrapping SaaS products, web scraping, and more
I know how the desire to try out a flashy new npm package in a quick project can easily paralyze my will, and the whole project's progress, while I explore new docs and code. So in this write-up I would like to give appreciation to all the tools and techniques that have proven effective for me.
Experimenting with GPT's summarization capabilities often confuses people: what can it do, and what can't it?
Web scraping is a popular technique that allows developers to quickly and easily extract data from websites. It's especially useful for extracting real estate information, such as property listings and median home prices. In this blog post, we'll explore how to web scrape Zillow with the low-code platform ScrapeNinja and JavaScript. Table of contents: * Why ScrapeNinja? * The approach * Prerequisites * Choosing a scraping strategy * Switching the proxy country of a web scraper * Switching to heavie
There are a number of projects that allow website monitoring, but I needed a pretty custom one: I wanted to check the Apple.com refurbished section for iPhone 12 models and be alerted when one shows up - not by email, but with a real push notification to my phone. I decided to build it with some awesome tools - Make.com, ScrapeNinja, and ntfy - and it took me around 20 minutes to get it running, so I packed my experience into this tutorial. Step #1: The Ta
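The ntfy step of that pipeline is just a plain HTTP POST to ntfy.sh/&lt;topic&gt;, which any phone subscribed to that topic receives as a push notification. Here is a minimal sketch of building such a request; the topic name `apple-refurb-watch` is a hypothetical example, not the one from the tutorial.

```javascript
// Build the HTTP request that ntfy.sh expects: POST to https://ntfy.sh/<topic>,
// message in the body, optional notification title in the "Title" header.
// The topic name used below is hypothetical.
function buildNtfyRequest(topic, message, title) {
  return {
    url: `https://ntfy.sh/${topic}`,
    options: {
      method: 'POST',
      headers: { Title: title },
      body: message,
    },
  };
}

// Usage with the built-in fetch (Node 18+); sends a real push notification
// to every device subscribed to the topic:
// const req = buildNtfyRequest(
//   'apple-refurb-watch',
//   'Refurbished iPhone 12 is in stock!',
//   'Apple Store alert'
// );
// await fetch(req.url, req.options);
```

Anything that can issue an HTTP request - Make.com, curl, a cron script - can fire this, which is what makes ntfy such a convenient last mile for custom alerts.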
Puppeteer is an incredibly useful tool for automating web browsers. It allows you to run headless (or non-headless) Chrome instances, automatically interacting with websites and pages in ways that would normally require manual input from a user or other scripts. In a lot of cases (particularly in web scraping tasks) HTTP requests need to look like they originate from different IPs or networks than the server running Puppeteer - and this is where proxies come into play. In this blog po
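As a quick sketch of the proxy setup: Chrome takes the proxy address via the `--proxy-server` launch flag, and Puppeteer supplies credentials per page with `page.authenticate()` (proxy credentials cannot go into the flag itself). The proxy host, port, and credentials below are hypothetical placeholders.

```javascript
// Build Puppeteer launch options that route all browser traffic through an
// HTTP proxy. Chrome reads the proxy address from the --proxy-server flag.
function proxyLaunchOptions(host, port) {
  return { args: [`--proxy-server=http://${host}:${port}`] };
}

// Usage (requires `npm install puppeteer` and a working proxy):
// const puppeteer = require('puppeteer');
// (async () => {
//   const browser = await puppeteer.launch(
//     proxyLaunchOptions('proxy.example.com', 8080)
//   );
//   const page = await browser.newPage();
//   // Proxy auth is per-page, not part of the --proxy-server flag:
//   await page.authenticate({ username: 'user', password: 'pass' });
//   await page.goto('https://example.com');
//   await browser.close();
// })();

console.log(proxyLaunchOptions('proxy.example.com', 8080).args[0]);
// → --proxy-server=http://proxy.example.com:8080
```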