Storage rental rates analyzer.
Business: Storage Rental
Location: Lehi, Utah, United States. Company: Stack Storage
Client: Nache Nielson
Website: https://stackstorage.us
The client’s business provides storage rental services all over the United States. To run it, he needed to keep track of over 10,000 (ten thousand) locations across the United States, listed on 5 different companies' websites.
Challenges:
Keeping track of storage rental rates for over 10,000 (ten thousand) locations across the United States, spread over 5 different companies' websites, is not an easy task to do daily; it takes a lot of manpower and resources. So the client hired us to build a web application where he could simply log in and see daily records of storage rental rates for any location in the US.
He also wanted to be able to add new location links himself whenever needed.
For this project, we needed to build 5 different scrapers for 5 different websites, each scraping the rental prices and details of every storage unit once a day. Each website was built with different technologies, so every site raised new challenges of its own. There was also a lot of work with the data: every day, 10,000+ new records arrived from the scrapers, and the backend had to process and sort them so the user could browse them easily. That alone was challenging, but the main challenge came in the deployment phase, because there were 7 different scripts working together. Maintaining all of them was a challenging, and also interesting, part of the work.
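For a sense of what each scraper did, here is a minimal sketch of one scraper's core loop using Selenium. The CSS selectors and field names are illustrative assumptions, not the selectors of the actual provider sites (each of which needed its own).

```python
# Minimal sketch of one scraper's daily pass over a location page.
# ".unit-card", ".unit-size", and ".unit-price" are placeholder
# selectors, not the ones used on the real provider sites.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By

def scrape_location(url: str) -> list[dict]:
    options = Options()
    options.add_argument("--headless")  # no display available on the VPS
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        records = []
        for card in driver.find_elements(By.CSS_SELECTOR, ".unit-card"):
            records.append({
                "size": card.find_element(By.CSS_SELECTOR, ".unit-size").text,
                "price": card.find_element(By.CSS_SELECTOR, ".unit-price").text,
                "url": url,
            })
        return records  # the caller saves these to the database
    finally:
        driver.quit()
```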
Final outcome:
After dealing with all the difficulties, we succeeded in delivering exactly the tool our client had needed from the very start. We built a web application that the client can access from anywhere using his credentials, and from which he can add new location links; the system automatically detects the website from the link. He can open any location and view the price record for a particular storage size at that location. We also built a graphing system for the data so the client has a clear view of price changes, and he can switch the graph between daily, monthly, and yearly views.
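The automatic website detection can be done by matching the domain of a submitted link against the known provider sites. A minimal sketch follows; the domains and scraper identifiers are placeholders, since the actual provider sites are not named here.

```python
# Minimal sketch of link-to-scraper detection. Domains and scraper
# names are placeholders for the 5 real provider websites.
from urllib.parse import urlparse

SCRAPER_BY_DOMAIN = {
    "provider-a.example.com": "scraper_a",
    "provider-b.example.com": "scraper_b",
    # ... one entry per supported provider website
}

def detect_scraper(link: str) -> str:
    host = urlparse(link).netloc.lower().removeprefix("www.")
    if host not in SCRAPER_BY_DOMAIN:
        raise ValueError(f"Unsupported website: {host}")
    return SCRAPER_BY_DOMAIN[host]
```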
How we did this:
When the client came to us with his idea, he wasn't sure about the whole thing. We listened to the problem and offered him several different solutions. Then we defined a proper MVP and started the system design and development process. First, we built the scrapers, which go to a link, look for storage units, scrape their prices and details, and save everything to a database. Next we built a backend to handle that data: filtering, sorting, and shaping it for the graph model, along with APIs for logging in, adding new locations, editing them, and many other actions.

After completing the backend, we moved on to the frontend. We used a graph view for the data and a list view for the locations, with a clean dashboard and a profile page for editing profile details. We deployed the 7 scripts across two different hosting setups: Heroku for the frontend and backend, and VPS servers for the 5 scrapers. In the end, we succeeded in solving our client's problem and delivering the tool he needed.
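The daily, monthly, and yearly graph settings map naturally onto a group-by-date aggregation. Here is a minimal sketch of that query with pymongo, assuming each price record carries location, size, price, and date fields (the collection and field names are assumptions).

```python
# Minimal sketch of the graph data query: average price per day,
# month, or year for one storage size at one location. Collection
# and field names are assumptions, not the production schema.
from pymongo import MongoClient

DATE_FORMAT = {"daily": "%Y-%m-%d", "monthly": "%Y-%m", "yearly": "%Y"}

def price_series(location: str, size: str, granularity: str = "daily"):
    prices = MongoClient()["storage"]["prices"]
    pipeline = [
        {"$match": {"location": location, "size": size}},
        {"$group": {
            "_id": {"$dateToString": {
                "format": DATE_FORMAT[granularity], "date": "$date"}},
            "avg_price": {"$avg": "$price"},
        }},
        {"$sort": {"_id": 1}},
    ]
    return list(prices.aggregate(pipeline))
```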
Language & Technologies:
We used Python as the base: Django for the APIs, Selenium for the scraping bots, React with HTML, CSS, and Bootstrap for the frontend, and MongoDB for the database. We deployed the backend and frontend to Heroku; for the scrapers, we used 2 VPS servers.
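For the daily runs on the scraper servers, any standard scheduler works. The sketch below uses APScheduler as one possibility (the original deployment may have used cron or another scheduler instead), with placeholder entry points standing in for the per-site scrapers.

```python
# Minimal scheduling sketch: run every scraper once a day.
# scrape_site_a / scrape_site_b are placeholder entry points.
from apscheduler.schedulers.blocking import BlockingScheduler

def scrape_site_a():
    ...  # one provider's scraper entry point

def scrape_site_b():
    ...  # another provider's scraper entry point

scheduler = BlockingScheduler()

@scheduler.scheduled_job("cron", hour=2)  # once daily at 02:00
def run_all_scrapers():
    for scrape in (scrape_site_a, scrape_site_b):
        scrape()

if __name__ == "__main__":
    scheduler.start()
```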
Client Review:
The Core Devs team was great to work with! Highly likely we will work with them again on additional projects in the future.