from Wikipedia, the free encyclopedia

Web crawler

A web crawler (also known as a web spider or web robot) is a program or automated script that browses the World Wide Web in a methodical, automated manner. This process is called Web crawling or spidering. Many legitimate sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which will index the downloaded pages to provide fast searches. Crawlers can also be used to automate maintenance tasks on a Web site, such as checking links or validating HTML code. Crawlers can likewise be used to gather specific types of information from Web pages, such as harvesting e-mail addresses (usually for spam).
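The methodical browsing described above can be sketched as a breadth-first traversal: fetch a page, extract its links, and add unseen links to a queue. The sketch below is a minimal illustration using only the Python standard library; the `fetch` parameter is a hypothetical injection point (any callable mapping a URL to an HTML string), and real crawlers would also honor robots.txt, rate limits, and politeness delays, all omitted here.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, fetch, max_pages=100):
    """Breadth-first crawl starting from seed_url.

    `fetch` is any callable url -> HTML string (a stand-in for
    urllib/requests, so the sketch also works with a test stub).
    Returns a dict mapping each visited URL to its outgoing links.
    """
    frontier = deque([seed_url])
    visited = {}
    host = urlparse(seed_url).netloc
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        try:
            html = fetch(url)
        except Exception:
            continue  # skip unreachable or malformed pages
        parser = LinkExtractor()
        parser.feed(html)
        # Resolve relative links and stay on the seed's host.
        out = [urljoin(url, href) for href in parser.links]
        out = [u for u in out if urlparse(u).netloc == host]
        visited[url] = out
        frontier.extend(u for u in out if u not in visited)
    return visited
```

In practice, the pages collected in `visited` would be handed to an indexer (for a search engine) or scanned for broken links or invalid markup (for site maintenance).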
