AfterShip charged us 26K USD and refused to refund
One day we were faced with a staggering bill from AfterShip, a company that provides an API for tracking shipment status.
As a web developer and CTO with over 15 years of experience, I am passionate about building profitable small SaaS products and pursuing a go-to-market strategy for them. My areas of expertise include high-performance systems, networking technology and APIs, SRE, automation with Puppeteer.js, web scraping, and SQL databases.
I know how the desire to try out a flashy new npm package in a quick project can easily paralyze my will, and the whole project's progress, while I dig through new docs and code. So in this write-up I would like to give credit to the tools and techniques that have proven effective for me.
Experimenting with GPT's summarization functionality often confuses people: what can it do, and what can it not do?
Web scraping is a popular technique that allows developers to quickly and easily extract data from websites. It's especially useful for pulling real estate information, such as property listings and median home prices. In this blog post, we'll explore how to scrape Zillow with the low-code platform ScrapeNinja and JavaScript.
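To give a feel for what calling ScrapeNinja from JavaScript looks like, here is a minimal Node.js sketch. The RapidAPI endpoint, header names, and the Zillow search URL are assumptions for illustration; check the current ScrapeNinja docs for the exact request and response shape.

```js
// Minimal sketch: fetch a Zillow search page through ScrapeNinja (Node 18+ for built-in fetch).
// The endpoint and headers follow ScrapeNinja's public RapidAPI listing — verify them
// against the current docs before relying on this.
async function scrapeZillow() {
  const response = await fetch('https://scrapeninja.p.rapidapi.com/scrape', {
    method: 'POST',
    headers: {
      'content-type': 'application/json',
      'X-RapidAPI-Key': process.env.RAPIDAPI_KEY,        // your RapidAPI key
      'X-RapidAPI-Host': 'scrapeninja.p.rapidapi.com',
    },
    body: JSON.stringify({
      url: 'https://www.zillow.com/homes/Austin,-TX_rb/', // example search page
    }),
  });

  const data = await response.json();
  // The target page's HTML is expected somewhere in the payload (commonly a `body` field);
  // log it and adapt the parsing to the actual shape you get back.
  console.log(data);
}

scrapeZillow().catch(console.error);
```

From there it is a matter of parsing the returned HTML (or the JSON Zillow embeds in the page) to pull out listings and prices.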
There are a number of projects that allow website monitoring, but I needed a pretty custom one: I wanted to check the Apple.com refurbished section for iPhone 12 models and get a push notification on my phone when one shows up. I also wanted pretty custom alerts, not email.
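As a rough sketch of that idea (not the exact setup from the post), the whole monitor fits in a few lines of Node.js: poll the page, look for the model name, push a message. ntfy.sh is used here purely as an example push channel, and the URL and search string are assumptions.

```js
// Naive monitor sketch (Node 18+): poll the Apple refurbished listing and send a push
// notification when an iPhone 12 shows up. ntfy.sh is just one easy push channel —
// subscribing to the topic in the ntfy mobile app delivers the message to your phone.
const PAGE_URL = 'https://www.apple.com/shop/refurbished/iphone'; // assumed listing URL
const NTFY_TOPIC = 'https://ntfy.sh/my-refurb-alerts';            // pick your own topic name

async function check() {
  const res = await fetch(PAGE_URL);
  const html = await res.text();
  if (html.includes('iPhone 12')) {
    // ntfy.sh turns a plain POST body into a push notification for topic subscribers
    await fetch(NTFY_TOPIC, { method: 'POST', body: 'Refurbished iPhone 12 spotted on Apple.com' });
  }
}

check().catch(console.error);
setInterval(() => check().catch(console.error), 10 * 60 * 1000); // re-check every 10 minutes
```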
Puppeteer is an incredibly useful tool for automating web browsers. It lets you run headless (or non-headless) Chrome instances, automatically interacting with websites and pages in ways that would normally require manual input from a user or other scripts. In a lot of cases (particularly in web scraping tasks) it is indispensable.
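For anyone who has not touched it yet, a minimal Puppeteer script looks roughly like this (the target URL is just a placeholder):

```js
// Minimal Puppeteer example: launch headless Chrome, open a page, read its title.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();
  await page.goto('https://example.com', { waitUntil: 'networkidle2' });
  console.log('Page title:', await page.title());
  await browser.close();
})();
```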