sportsfire
Posted on October 16, 2024
To scrape links from a webpage using PHP, you can fetch the HTML content with the file_get_contents function and then parse it with the DOMDocument class. Here's a simple example:
<?php
// Function to scrape links from a given URL
function scrapeLinks($url) {
    // Get the HTML content of the webpage
    $html = file_get_contents($url);

    // Bail out if the request failed
    if ($html === false) {
        return [];
    }

    // Create a new DOMDocument instance
    $dom = new DOMDocument();

    // Suppress errors due to malformed HTML
    libxml_use_internal_errors(true);

    // Load the HTML content
    $dom->loadHTML($html);

    // Clear the errors
    libxml_clear_errors();

    // Create an array to hold the links
    $links = [];

    // Get all <a> elements
    $anchors = $dom->getElementsByTagName('a');

    // Loop through the anchors and collect the href attributes
    foreach ($anchors as $anchor) {
        $href = $anchor->getAttribute('href');
        // Add the link to the array if it's not empty
        if (!empty($href)) {
            $links[] = $href;
        }
    }

    return $links;
}

// Example usage
$url = 'https://www.example.com'; // Change this to the URL you want to scrape
$links = scrapeLinks($url);

// Print the scraped links
foreach ($links as $link) {
    echo $link . PHP_EOL;
}
?>
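Note that the array returned by scrapeLinks can contain duplicates as well as hrefs that aren't real page links, such as in-page anchors (`#top`), `mailto:`, or `javascript:` URLs. As a rough sketch (the function name `filterLinks` is my own, not part of any library), you could clean the result like this:

```php
<?php
// A sketch: drop non-HTTP hrefs (in-page anchors, mailto:, javascript:)
// and deduplicate the remaining links.
function filterLinks(array $links): array {
    $clean = array_filter($links, function (string $href): bool {
        // Skip in-page anchors like "#top"
        if (str_starts_with($href, '#')) {
            return false;
        }
        // Keep relative URLs (no scheme) and http/https URLs only
        $scheme = parse_url($href, PHP_URL_SCHEME);
        return $scheme === null || in_array($scheme, ['http', 'https'], true);
    });
    // array_unique removes duplicates; array_values reindexes the array
    return array_values(array_unique($clean));
}
```

For example, `filterLinks(['/a', '/a', 'mailto:x@y.com', '#top', 'https://example.com'])` returns `['/a', 'https://example.com']`.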