AC Milan Niang: Is he a Milan legend we forgot?

Alright, so today I’m gonna talk about this little side project I messed around with: “ac milan niang.” Don’t get any fancy ideas, it ain’t about football players or anything like that. It was just a fun little name I picked randomly for a thing I was building.

First off, I wanted to build a simple web scraper. You know, something to grab data off a website. I started by figuring out what site I wanted to pull info from. Decided on a site that listed, like, daily deals. Pretty basic stuff. I knew Python was the way to go for this kinda thing, so I fired up my IDE.

Got the basics set up. Installed `requests` and `BeautifulSoup4`. If you haven’t used those before, `requests` lets you grab the HTML of a webpage, and `BeautifulSoup` helps you parse it so you can actually find the stuff you want. That part was pretty smooth.
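
Here's roughly what that basic setup looked like, just to give you the idea (the URL is a placeholder, since I'm not naming the actual deals site):

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL -- the real deals site isn't named in this post
url = "https://example.com/daily-deals"

# requests fetches the raw HTML; BeautifulSoup turns it into a searchable tree
response = requests.get(url, timeout=10)
response.raise_for_status()
soup = BeautifulSoup(response.text, "html.parser")

print(soup.title.get_text())  # quick sanity check that the page actually came back
```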

Now, the real work began: inspecting the webpage. Used my browser’s developer tools (right-click, “Inspect” usually) to see the HTML structure. This is where you gotta hunt around to figure out which HTML tags contain the data you’re after. Found the divs with the deal information and the links. It was a bit of a mess, honestly. The site wasn’t exactly designed for scraping.

Next, I wrote the actual scraping code. Used `requests.get()` to fetch the webpage’s HTML, then passed it to `BeautifulSoup` to create a soup object. Then, I started using `find_all()` to grab all the divs containing the deals. Iterated through those results, pulling out the deal title, price, and URL. It took some trial and error to get the CSS selectors right. Kept getting empty lists at first! Figured out it was a subtle difference in the class names. Always pay attention to that!
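
The loop itself went roughly like this (the class names below are made up for illustration; the real ones came from poking around in the inspector):

```python
# Collect one dict per deal; the div/class names are placeholders
deals = []
for card in soup.find_all("div", class_="deal-card"):
    title_tag = card.find("h2", class_="deal-title")
    price_tag = card.find("span", class_="deal-price")
    link_tag = card.find("a", href=True)

    deals.append({
        "title": title_tag.get_text() if title_tag else "",
        "price": price_tag.get_text() if price_tag else "",
        "url": link_tag["href"] if link_tag else "",
    })
```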

The data was coming back all messy, with extra spaces and weird characters. So, I added some code to clean it up. Used `.strip()` to remove leading/trailing whitespace, and replaced some special characters with regular ones. Also made sure to handle any exceptions, like if a deal didn’t have a price listed.
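
The cleanup was nothing fancy; something along these lines (the exact character replacements depended on what the site spat out):

```python
import re

def clean_text(value):
    # Strip leading/trailing whitespace and collapse runs of spaces
    value = re.sub(r"\s+", " ", value.strip())
    # Swap a couple of common "fancy" characters for plain ones
    return value.replace("\u2019", "'").replace("\u00a0", " ")

for deal in deals:
    deal["title"] = clean_text(deal["title"])
    # Some deals had no price listed, so don't blow up on those
    deal["price"] = clean_text(deal["price"]) if deal["price"] else "N/A"
    deal["url"] = deal["url"].strip()
```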

Once I had the data extracted and cleaned, I needed to store it somewhere. At first, I just printed it to the console. But then I decided to save it to a CSV file. Used Python’s `csv` module for that. Opened a file in write mode, created a CSV writer object, and wrote the headers (deal title, price, URL) followed by the scraped data rows. Pretty straightforward.
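
That part is pretty much textbook `csv` module usage; roughly this, with the filename being whatever you want:

```python
import csv

# Write the cleaned deals out with a header row
with open("deals.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["title", "price", "url"])
    for deal in deals:
        writer.writerow([deal["title"], deal["price"], deal["url"]])
```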

Okay, scraping and saving to CSV was working. But I wanted to take it a step further. I wanted to automate this. So, I set up a cron job on my server to run the script every day at midnight. That way, I’d always have an up-to-date list of deals.
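
The crontab entry was just one line, something like this (the paths are placeholders for wherever the script and log actually live):

```
0 0 * * * /usr/bin/python3 /home/me/deals/scrape_deals.py >> /home/me/deals/scrape.log 2>&1
```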

The final touch was to send myself an email with the daily deals. Used Python’s `smtplib` module to connect to my email server and send the email. Attached the CSV file to the email so I could easily browse the deals. This took a bit of fiddling to get the authentication right. Had to enable “less secure app access” in my Gmail settings (yeah, I know, not ideal, but it worked for this little project).
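
The email bit looked roughly like this (addresses and the password are obviously placeholders; I was pointing it at Gmail’s SMTP server):

```python
import smtplib
from email.message import EmailMessage

# Placeholder addresses -- swap in your own
msg = EmailMessage()
msg["Subject"] = "Daily deals"
msg["From"] = "me@example.com"
msg["To"] = "me@example.com"
msg.set_content("Today's scraped deals are attached.")

# Attach the CSV we wrote earlier
with open("deals.csv", "rb") as f:
    msg.add_attachment(f.read(), maintype="text", subtype="csv", filename="deals.csv")

# Gmail's SMTP server over SSL; the login details are whatever your provider requires
with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
    server.login("me@example.com", "password-or-app-password-here")
    server.send_message(msg)
```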

Here’s a quick summary of the steps:

  • Installed `requests` and `BeautifulSoup4`.
  • Inspected the target webpage.
  • Wrote the scraping code using `requests` and `BeautifulSoup`.
  • Cleaned and processed the scraped data.
  • Saved the data to a CSV file.
  • Set up a cron job to run the script daily.
  • Sent myself an email with the CSV attachment.

It wasn’t anything earth-shattering, but it was a fun little project. It showed me how easy it is to automate simple tasks with Python and web scraping. And, you know, it gave me something to do on a rainy weekend.
