In this tutorial we are going to build a price tracker application that will notify us about discounts.
Creating complex projects is the key to learning fast! If you have been following me for a while, you already know I like complex stuff, so we are going to use RabbitMQ + Celery + BeautifulSoup + Django to build this app.
Alright! Let's Start!
Celery is the best choice for background task processing in the Python/Django ecosystem. It has a simple and clear API, and it integrates beautifully with Django. So, we are using Celery to handle the time-consuming work by passing it to a queue to be executed in the background, keeping the server ready to respond to new requests.
Celery requires a solution to send and receive messages; usually this comes in the form of a separate service called a message broker. We will configure Celery to use the RabbitMQ messaging system, as it provides robust, stable performance and interacts well with Celery.
We can install RabbitMQ from Ubuntu's repositories with the following command:
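sudo apt-get update
sudo apt-get install rabbitmq-server

On Ubuntu the RabbitMQ service starts automatically after installation; you can verify it is running with sudo systemctl status rabbitmq-server.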
We will use BeautifulSoup to crawl the price and title of the item at a given URL. After the data is crawled, we have to convert the price to a float and create a new object in the database.
views.py
from urllib.request import urlopen, Request
from bs4 import BeautifulSoup


def crawl_data(url):
    # User Agent is to prevent 403 Forbidden Error
    req = Request(url, headers={'User-Agent': 'Mozilla/5.0'})
    html = urlopen(req).read()
    bs = BeautifulSoup(html, 'html.parser')
    title = bs.find('h1', id="itemTitle").get_text().replace("Details about", "")
    price = bs.find('span', id="prcIsum").get_text()
    clean_price = float(price.strip().replace("US", "").replace("$", ""))
    return {'title': title, 'last_price': clean_price}
strip() removes spaces at the beginning and end of a string, and replace() replaces a specified phrase with another specified phrase. We used these methods to get a clean price and title for the item.
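As a quick sanity check, you can call the function from the Django shell (python manage.py shell); the URL below is just a placeholder for an eBay item page that exposes the itemTitle and prcIsum elements:

from tracker.views import crawl_data

# placeholder URL, replace with a real item page
data = crawl_data('https://www.ebay.com/itm/1234567890')
print(data)  # something like {'title': ' Cool Gadget ', 'last_price': 49.99}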
Once the data is crawled successfully, it is time to create a new object in the database. We will call this function on form submission to crawl the title and price of the new item.
views.py
from django.shortcuts import render, get_object_or_404, HttpResponseRedirect

from .models import Item
from .forms import AddNewItemForm


def tracker_view(request):
    items = Item.objects.order_by('-id')
    if request.method == 'POST':
        form = AddNewItemForm(request.POST)
        if form.is_valid():
            url = form.cleaned_data.get('url')
            requested_price = form.cleaned_data.get('requested_price')
            # crawling the data
            crawled_data = crawl_data(url)
            Item.objects.create(
                url=url,
                title=crawled_data['title'],
                requested_price=requested_price,
                last_price=crawled_data['last_price'],
                discount_price='No Discount Yet',
            )
            return HttpResponseRedirect('')
    else:
        form = AddNewItemForm()
    context = {
        'items': items,
        'form': form,
    }
    return render(request, 'tracker.html', context)
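The Item model and AddNewItemForm are not shown above; a minimal sketch, with field names taken from the view code and field types assumed, could look like this:

models.py

from django.db import models


class Item(models.Model):
    # field types are assumptions; DecimalField would also work for the prices
    url = models.URLField()
    title = models.CharField(max_length=255)
    requested_price = models.FloatField()
    last_price = models.FloatField()
    discount_price = models.CharField(max_length=255, default='No Discount Yet')

    def __str__(self):
        return self.title

forms.py

from django import forms


class AddNewItemForm(forms.Form):
    url = forms.URLField()
    requested_price = forms.FloatField()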
Great! Now we need to crawl the data for all objects continuously to stay aware of discounts. If we do this without Celery, the server connection will time out, which means the server takes too long to reply to a request, and our application will become unusable.
Create tasks.py in your app and let's handle this with Celery tasks.
tasks.py
import time

from celery import shared_task

from .models import Item
from tracker.views import crawl_data


@shared_task
def track_for_discount():
    # do something heavy
    items = Item.objects.all()
    for item in items:
        # crawl item url
        data = crawl_data(item.url)
        # check for discount
        if data['last_price'] < item.requested_price:
            print(f'Discount for {data["title"]}')
            # update discount field to notify user
            item_discount = Item.objects.get(id=item.id)
            item_discount.discount_price = f'DISCOUNT! The price is {data["last_price"]}'
            item_discount.save()


# simple scheduler: run the check every 15 seconds
while True:
    track_for_discount()
    time.sleep(15)
@shared_task creates an independent instance of the task for each app, making the task reusable. This is what makes the @shared_task decorator useful for libraries and reusable apps, since they do not have access to the app of the user.
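For the shared tasks to actually reach RabbitMQ, the project also needs a Celery application instance. A minimal sketch, assuming the Django project is called pricetracker (replace it with your own project name), could sit next to settings.py:

celery.py

import os

from celery import Celery

# 'pricetracker' is a placeholder for your Django project name
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'pricetracker.settings')

app = Celery('pricetracker', broker='amqp://localhost')

# read any CELERY_* settings from settings.py and find tasks.py modules in installed apps
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

With RabbitMQ running, start a worker with celery -A pricetracker worker -l info; calling a task with .delay(), for example track_for_discount.delay(), then queues it on the broker instead of running it inline.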
We are simply crawling the data every 15 seconds and comparing the last price with the requested price. If the last price is lower than the requested price, we update the discount price field.
What if the price increases again?
@shared_task
def track_for_not_discount():
    items = Item.objects.all()
    for item in items:
        data = crawl_data(item.url)
        if data["last_price"] > item.requested_price:
            print(f'Discount finished for {data["title"]}')
            item_discount_finished = Item.objects.get(id=item.id)
            item_discount_finished.discount_price = 'No Discount Yet'
            item_discount_finished.save()
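If you keep the simple 15-second loop, it can now trigger both checks. A small sketch, assuming a Celery worker is running so that .delay() hands the crawling to the background worker:

while True:
    # queue both checks on the broker so the worker crawls in the background
    track_for_discount.delay()
    track_for_not_discount.delay()
    time.sleep(15)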
Great! Now it will be possible to track discounts properly. You can add one more function that detects prices close to the requested price and notifies the user about them, for instance when the item price is $100 and the requested price is $97. But let's keep it simple for now.
Well, you can improve the project by adding email functionality so Django will send an email about discounts. Take a look at How to Send Email in a Django App.
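As a rough sketch, assuming your email settings (EMAIL_BACKEND, SMTP credentials) are configured in settings.py, the discount branch of track_for_discount could notify the user with Django's send_mail; the addresses below are placeholders:

from django.core.mail import send_mail

# inside the discount branch of track_for_discount
send_mail(
    subject=f'Discount for {data["title"]}',
    message=f'The price dropped to {data["last_price"]} (you wanted {item.requested_price}).',
    from_email='tracker@example.com',    # placeholder sender
    recipient_list=['you@example.com'],  # placeholder recipient
    fail_silently=False,
)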
Feel free to contribute to the project :)
You can clone this project from my GitHub repository below