How lasers and atoms could change the future of computation

When you step into the quiet, temperature-controlled room at the National Quantum Computing Centre (NQCC) in Harwell, Oxfordshire, the first thing you notice is a trio of great black boxes, each containing a prototype quantum computer.

They are clad with shutters, partly to protect your eyes from powerful laser light but also to prevent heat or vibrations – even pressure waves from someone walking in – from interfering with this new and sensitive kind of computer, one that uses individual atoms to explore realms beyond the reach of conventional, ‘classical’ computers in your phone, home or office. 

Laboratories at the National Quantum Computing Centre (NQCC), Harwell Campus, Didcot, Oxfordshire. Image source: NQCC

Under development there for the past year, the trapped-atom quantum computer remains only a promise: a machine that computes not just with logic, but with the shimmering probabilities of quantum reality. It is one of a dozen different kinds of quantum computer being studied at the NQCC. 

Officially opened in October 2024, with an investment approaching £100 million, the NQCC’s 4,000-square-metre facility is Britain’s answer to a global challenge: to gather competing quantum technologies under one roof and see which are most likely to prosper.  

Walk its corridors and you find a variety of approaches—superconducting circuits cooled to millikelvin temperatures with the help of theatrical chandeliers and chirping pumps made nearby at Oxford Instruments; trapped ions suspended in electromagnetic fields; photonic processors that compute with light; others that rely on silicon chips; and, glowing softly within black shutters, the neutral-atom arrays.

Indeed, the many different approaches reflect how quantum computing is still in its infancy, explains Luke Fernley, a researcher at the NQCC.

At their heart, quantum computers exploit the strange properties of quantum mechanics, which was developed at the start of the 20th century to describe nature at the smallest scales. “Neutral-atom-based quantum computers control the positions and properties of individual atoms,” he said. “This exquisite control allows their strange quantum mechanical properties to be exploited to perform computation.” 

The theory, which is deeply counterintuitive, also says that the fates of two particles can be linked, letting quantum computers pursue many calculations at once rather than one at a time. “Think of each atom as a switch that can be on, off, or a mixture of both on and off,” he said. “These atoms can interact with each other, causing an inextricable link known as ‘entanglement’. This link couples the states of many atoms at once, facilitating simultaneous calculations.”
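In the standard notation of quantum mechanics – a textbook formulation, not the NQCC’s own – that “mixture of both on and off” is a superposition, and an entangled pair can no longer be described as two independent switches:

|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

|\Phi^+\rangle = \tfrac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)

Here |\alpha|^2 and |\beta|^2 are the probabilities of reading 0 or 1 from a single atom, and measuring either atom of the entangled pair |\Phi^+\rangle instantly fixes what the other will show.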

Laboratories at the National Quantum Computing Centre (NQCC), Harwell Campus, Didcot, Oxfordshire. Image source: NQCC

Though he started out at Durham University studying molecular quantum computers, which offer richer (but currently slower) possibilities, he believes atoms remain a fundamental gateway technology. All quantum machines, whatever their hardware, at heart manipulate qubits: the quantum counterparts of the 0s and 1s in an ordinary computer, able to represent a 1 and a 0 at the same time.

Entanglement means that qubits in superposition can be correlated with one another, letting quantum computers tackle problems that are intractable for classical machines. Unlike the bits in a classical computer, which are rigidly on or off, qubits can occupy many states at once and so explore a vast landscape of possibilities simultaneously.

A quantum algorithm, or program, begins life much like a classical one—as a set of logical steps—but must be reimagined in the language of quantum mechanics. Each operation is translated into an atomic recipe that dictates which qubits to entangle and how to measure the outcome. On a neutral-atom quantum computer, the computation unfolds as a carefully timed dance of photons and atoms as the quantum algorithm is translated from mathematics into matter. 

By using finely tuned lasers to trap and control individual atoms, Luke Fernley and his colleagues aim to make them work together as a testbed for simulating quantum phenomena—for example, modelling molecular interactions to aid drug design. Next spring, they plan to use entangled pairs of atoms to implement quantum logic gates built from light and matter. 

A central challenge lies in readout: although quantum computers can explore vast numbers of possibilities simultaneously, this parallelism cannot be directly observed. When the algorithm concludes, the delicate quantum superposition ‘collapses’ to give one answer. Useful results emerge only by running the algorithms many times to build up reliable statistics. 
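A toy illustration of why this repetition matters – a classical simulation in R, not NQCC code, with a probability of 0.7 assumed purely for illustration – shows that a single run yields only a 0 or a 1, while the underlying probability emerges from the statistics of many shots:

# Classical toy model of quantum readout: each run of the machine
# collapses to a single 0 or 1; the probability hidden in the
# superposition only emerges from many repeated runs.
p_one <- 0.7                                    # assumed chance of reading a 1
one_shot <- rbinom(1, size = 1, prob = p_one)   # a single run: just 0 or 1
many_shots <- rbinom(10000, size = 1, prob = p_one)
mean(many_shots)                                # ~0.7 after 10,000 runs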

Atom wrangling 

To control atoms takes light. Throw open the black shutters and inside there is a banquet-sized optical table: thick metal drilled with a lattice of holes for mounting all kinds of bits and pieces to manipulate laser light—from collimators to lenses and prisms that fold and shape beams, to acousto-optic and electro-optic modulators that flick beams on and off or change their frequency in nanoseconds—and an orchestra of mirrors mounted on fine-threaded actuators. 

The business end, where laser light does its work, is an 8cm by 2cm glass-fronted vacuum cell. Inside, the NQCC team anticipate trapping hundreds of caesium and rubidium atoms at a time in a neat array, using lasers that can herd atoms by bombarding them with photons, the particles of light.

At first the team uses lasers and magnetic fields to herd millions of atoms, even tens of millions, into a ball a few millimetres in diameter held in what is called a magneto-optical trap. Because atomic motion is tantamount to heat, the same confining lasers that create this blob also cool the atoms, reaching a few hundred millionths of a degree above absolute zero – colder than outer space.
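For scale, the standard Doppler limit of laser cooling – a textbook figure for a transition of linewidth \Gamma, not an NQCC measurement – sets the temperature floor of this first stage:

T_D = \frac{\hbar\,\Gamma}{2 k_B}

For rubidium’s D2 line, with \Gamma \approx 2\pi \times 6\,\mathrm{MHz}, this gives T_D \approx 146\,\mu\mathrm{K}: the “few hundred millionths of a degree” described above.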

At this temperature, atoms move at a tiny fraction of the speed they do at room temperature. With the help of extra cooling steps, involving precise tweaking of magnetic fields, laser power and polarisation, the atoms can reach a few millionths of a degree above absolute zero, at which point the team can use another set of tightly focused lasers like microscopic tweezers to position them. These tweezer beams form patterns – lines, grids, even honeycombs – “where an array of several single atoms is held by these optical tweezers as gently as eggs in an egg box,” he explained.

When they want to entangle a pair of atoms, they use a trick called the Rydberg blockade: nearby atoms are briefly brought together and tickled with yet more lasers into high-energy “Rydberg” states, in which atoms can “feel” each other a little like atomic magnets, so that changing one automatically influences the other. “One atom’s excitation prevents the other’s, so the qubits’ states become entangled,” he said. “This entanglement opens the pathway to quantum parallelism, the mechanism to help solve puzzles that normal computers can’t.”
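A rough textbook picture of when the blockade takes hold – the general mechanism rather than the NQCC’s specific parameters – is that two Rydberg atoms a distance R apart shift each other’s energy through a van der Waals interaction, and double excitation is suppressed once that shift exceeds the laser coupling \hbar\Omega, defining a blockade radius:

V(R) = \frac{C_6}{R^6}, \qquad R_b = \left(\frac{C_6}{\hbar\,\Omega}\right)^{1/6}

Any two atoms closer than R_b behave as the linked pair described above: exciting one forbids exciting the other.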

These fleeting interactions – performed over thousandths to millionths of a second – form the logic gates to perform calculations using a fragile but powerful web of atomic correlations, sustained only if the machine is exquisitely isolated from the noisy outside world. The entire apparatus—from the vibration-damped optical table to the thicket of mirrors and modulators—is there to preserve this delicate quantum choreography.  

Laboratories at the National Quantum Computing Centre (NQCC), Harwell Campus, Didcot, Oxfordshire. Image source: NQCC

Cameras and single-photon detectors peer at the atoms through viewports, recording the faint fluorescence emitted when an electron within an atom shifts to a lower state. At the end of a computation, a sensitive camera takes a picture: glowing dots reveal 1s, dark ones 0s. The pattern is the answer.

The lasers must remain extraordinarily stable for days; photon detection—the only way to read out results—has little tolerance for error when you have to pick up single particles of light from individual atoms; vibration or electromagnetic interference can ruin hours of careful alignment; and it takes time, of the order of a thousandth of a second, to bring atoms into close encounters.

While a dozen quantum computers are being tested at the NQCC, two commercial companies nearby on the Harwell campus are also investigating atomic quantum computers: SQALE, the Scalable Quantum Atomic Lattice computing tEstbed, a neutral-atom testbed with 16-by-16 arrays of atoms developed by Infleqtion, and another machine being developed by QuEra Computing. Other teams worldwide are taking a similar approach, with one in Boston, USA, recently describing a 3,000-qubit computer, and another in Pasadena unveiling an array of 6,100 atomic qubits.

Simulation and chemistry 

Quantum processors – including neutral-atom and molecular arrays – can simulate chemistry in unprecedented detail, mimicking molecules more faithfully than classical machines can. For drug discovery or materials design, this could mean testing how a compound binds, folds or reacts before any synthesis in the lab. The same tools could help uncover new superconductors, catalysts or battery materials tuned for efficiency and sustainability.

For an experiment with a single atomic species, such as rubidium or caesium, around five to seven different lasers are required to trap the atoms and produce entangled pairs. The resulting quantum processors could tackle problems that stump classical methods: optimising supply chains, traffic systems, or even the training of AI.

Reading out and analysing the results of the quantum processor is done with classical computers. The machine is in effect a hybrid, in which quantum co-processors handle the toughest calculations while classical hardware manages the rest. Such co-processors could become as commonplace as the GPU is today in exascale computers and AI.

Beautiful experimental results from around the world have demonstrated major milestones in making tweezer-based quantum computers real contenders in the zoo of quantum computing possibilities. As with all the rivals, however, scaling the number of qubits while maintaining their fragile quantum states remains a challenge.  

“At the NQCC, we are exploring dual-species architectures, using both rubidium and caesium in a single glass cell, to work towards overcoming these challenges. This requires around twice as many lasers, making dual-species experiments naturally more complex,” he said.

For now, though, modest victories matter: by next spring Luke Fernley hopes to have a few hundred atom pairs suspended in a vacuum, each a flickering logic element in a machine that computes with uncertainty itself. He and his fellow quantum computing scientists are learning to compute by choreographing the most delicate dance in physics—one where a single misstep can spoil the whole calculation.

The post How lasers and atoms could change the future of computation appeared first on Science Museum Blog.

Using Multi-modal Large Language Model to Boost Fireworks Algorithm’s Ability in Settling Challenging Optimization Tasks

arXiv:2511.03137v1 Announce Type: new Abstract: As optimization problems grow increasingly complex and diverse, advances in optimization techniques and paradigm innovations hold significant importance. The challenges posed by optimization problems are manifested chiefly in their non-convexity, high dimensionality, black-box nature, and other unfavorable characteristics. Traditional zero-order or first-order methods, often characterized by low efficiency, inaccurate gradient information, and insufficient utilization of optimization information, are ill-equipped to address these challenges effectively. In recent years, the rapid development of large language models (LLMs) has led to substantial improvements in their language understanding and code generation capabilities. Consequently, the design of optimization algorithms leveraging LLMs has garnered increasing attention from researchers. In this study, we choose the fireworks algorithm (FWA) as the basic optimizer and propose a novel approach to assist its design by incorporating a multi-modal large language model (MLLM). Put simply, we propose the concept of the Critical Part (CP), which extends the FWA to complex high-dimensional tasks, and we further exploit information arising during the optimization process with the help of the multi-modal capabilities of large language models. We focus on two specific tasks: the traveling salesman problem (TSP) and the electronic design automation problem (EDA). Experimental results show that FWAs generated under our new framework achieve or surpass SOTA results on many problem instances.

“David Hockney’s Iconic Art to Light Up Bradford Sky”

The Art of David Hockney: A Celestial Tribute

In the realm of contemporary art, few names shine as brightly as David Hockney. Known for his vibrant colors, bold compositions, and unique perspective on the world around him, Hockney has captured the imaginations of art enthusiasts for decades. From his iconic swimming pool paintings of the 1960s to his more recent digital works, Hockney’s artistry knows no bounds.

As we look to the stars tonight, we are reminded of Hockney’s monumental impact on the art world. In a tribute to his legacy, some of his most famous works will be projected into the night sky above his hometown of Bradford. This celestial display serves as a testament to Hockney’s enduring influence and the timeless beauty of his creations.

A Legacy Written in the Stars

Throughout history, artists have sought inspiration from the heavens above. From the celestial maps of ancient civilizations to the cosmic motifs of the Renaissance, the night sky has long been a source of wonder and creativity. In honoring Hockney’s art in this celestial tribute, we pay homage to this rich tradition of artists finding inspiration in the stars.

Just as Vincent van Gogh found solace in the swirling night sky of “Starry Night” and Georgia O’Keeffe sought inspiration in the vast expanses of the desert sky, so too does Hockney’s work resonate with the cosmic beauty of the universe. By projecting his art into the night sky, we bring his vision full circle, connecting his earthly creations to the infinite expanse above.

Celebrating Creativity in Every Form

Art knows no boundaries, transcending time and space to touch the hearts and minds of all who encounter it. Whether through paint on canvas, pixels on a screen, or stars in the sky, creativity finds a way to shine brightly in the world. As we gaze upon Hockney’s artistry in the night sky, let us be reminded of the boundless beauty that exists in the realm of human imagination.

Join us tonight as we celebrate the art of David Hockney in a unique and awe-inspiring display. Look up, and be transported to a world where creativity knows no limits, and the stars themselves pay tribute to the genius of one of the greatest artists of our time.

Some of David Hockney’s most famous works will appear in the night sky above the artist’s hometown of Bradford


“UMAP: Visualizing French Communes by Location and Population”

[This article was first published on r.iresmi.net, and kindly contributed to R-bloggers].



A photo of a dodecahedron in the twilight, openwork and lit from within

Floura – Light Bloom – The Art of Hybycozo – Desert Botanical Garden – CC-BY-NC by Alan English CPA

Day 6 of 30DayMapChallenge: « Dimensions » (previously).

According to Wikipedia, Uniform manifold approximation and projection (UMAP) is a nonlinear dimensionality reduction technique. It will allow us to project many dimensions (well, only 3 in this example) onto a 2D plane.

library(sf)
library(umap)
library(dplyr)
library(tidyr)
library(ggplot2)
library(ggrepel)
library(glue)

options(scipen = 100)

Data

We’ll use the French communes (get the data from this post).

com <- read_sf("~/data/adminexpress/adminexpress_cog_simpl_000_2022.gpkg",
               layer = "commune") |>
  st_centroid() |>                        # one point per commune
  mutate(x = st_coordinates(geom)[, 1],   # point coordinates as
         y = st_coordinates(geom)[, 2])   # plain numeric columns

UMAP

The dimensions taken into account are: location (x, y) and population. These variables should be scaled but the result is prettier without scaling…

# Fix the seed so the embedding is reproducible
umaps_params <- umap.defaults
umaps_params$random_state <- 20251106

com_umap <- com |>
  st_drop_geometry() |>
  select(x, y, population) |>   # the three dimensions to embed
  # scale() |>
  umap(config = umaps_params)

res <- com_umap$layout |>                    # 2D embedding, one row per commune
  as_tibble(.name_repair = "universal") |>
  bind_cols(com) |>                          # reattach the original attributes
  rename(UMAP1 = 1,
         UMAP2 = 2)

Map

res |>
  ggplot(aes(UMAP1, UMAP2, color = population)) +
  geom_point() +
  geom_text_repel(data = filter(res,
                                statut %in% c("Préfecture",
                                              "Préfecture de région",
                                              "Capitale d'état")),
                  aes(label = nom),
                  size = 3, force = .5, force_pull = 0.5, max.overlaps = 1e6,
                  bg.colour = "#ffffffaa", bg.r = .2, alpha = .6) +
  scale_color_viridis_c(trans = "log1p", option = "H",
                        breaks = c(1000, 50000, 500000, 2000000)) +
  coord_equal() +
  labs(title = "Uniform manifold approximation and projection of french communes",
       subtitle = "by location and population",
       caption = glue("https://r.iresmi.net/ - {Sys.Date()}
                      data from IGN Adminexpress 2022")) +
  theme_minimal() +
  theme(plot.caption = element_text(size = 6,
                                    color = "darkgrey"))
Figure 1: A UMAP representation of the French communes. Each point is a town; the patterns look like a firework.


Long-Term Implications and Possible Developments of Applying UMAP

The article details the application of Uniform Manifold Approximation and Projection (UMAP), a nonlinear dimensionality reduction technique, to project dimensions onto a two-dimensional plane. In this case, it is used to represent the French communes by location and population.

The potential implications of effectively implementing and furthering the development of techniques like UMAP are wide-reaching.

Advanced Data Visualization and Analysis

With UMAP, visualizing complex, high-dimensional data becomes easier and more intuitive. Analysts can understand intricate relationships between different parameters more effectively. Future developments might lead to more powerful dimensionality reduction techniques, enabling easier interpretation of even higher-dimensionality datasets.

Improved Efficiency

Reducing dimensionality usually speeds up computation without losing too much information, which can be a significant advantage when dealing with particularly large datasets. Continued improvement of algorithms like UMAP could lead to even more efficient data processing in the future.

Enhanced Machine Learning Models

Dimensionality reduction is crucial in many machine learning applications as it can help to mitigate the curse of dimensionality and reduce overfitting. Therefore, improvements in this field can lead to more accurate and reliable machine learning models.
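As a minimal sketch of that preprocessing role – an illustration on R’s built-in iris data, not part of the original post, with an arbitrary train/test split – the embedding produced by UMAP can be fed straight into a simple classifier:

# Minimal sketch: UMAP as a preprocessing step before classification.
library(umap)    # dimensionality reduction
library(class)   # k-nearest-neighbour classifier

set.seed(20251106)
emb <- umap(as.matrix(iris[, 1:4]))   # 4 numeric features -> 2D embedding
train <- sample(nrow(iris), 100)      # arbitrary train/test split

pred <- knn(train = emb$layout[train, ],
            test  = emb$layout[-train, ],
            cl    = iris$Species[train],
            k     = 5)
mean(pred == iris$Species[-train])    # held-out classification accuracy
# (For rigor, fit UMAP on the training rows only and project the test
# rows with predict(), to avoid leaking information across the split.)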

Actionable Advice

The application of UMAP or similar dimensionality reduction techniques could be beneficial for any business or researcher dealing with large, complex datasets. However, it’s crucial to understand where and how to use these tools effectively.

  • Upskill in data analysis techniques: The R programming language offers a wealth of packages and functions, such as UMAP, which can be beneficial for data analysis. Learning to work with these advanced tools would be a profitable upskill move for any data scientist or analyst.
  • Apply dimensionality reduction thoughtfully: While dimensionality reduction can be beneficial to simplify data analysis and visualizations, it should be used judiciously. It is important to understand that some data loss happens during the dimensionality reduction process. It’s crucial to ensure that the resulting model retains the essential features that will give the most accurate outcomes.
  • Keep up with latest developments: The field of data science is rapidly evolving, with new methods and tools developing constantly. Keep up-to-date with the latest research – tomorrow’s groundbreaking technique could be just around the corner.


“Automating Real-Time Search Data Collection with SerpApi for AI Model Training”

Learn how developers and data scientists use SerpApi to automate real-time search data collection for AI model training and analytics workflows.

Analyzing the Key Points of Real-time Search Data Collection

SerpApi is a tool that helps developers and data scientists automate the process of real-time search data collection. This tool is especially geared towards AI model training and analytics workflows. Considering this information, let’s evaluate the long-term implications and possible future developments.
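For concreteness, here is a minimal sketch of what that automation can look like in R, calling SerpApi’s documented JSON endpoint over plain HTTP; the query, engine and SERPAPI_KEY environment-variable name are illustrative assumptions, and a personal API key is required:

# Minimal sketch: collect Google search results via SerpApi's JSON API.
library(httr)      # HTTP client
library(jsonlite)  # JSON parsing

resp <- GET("https://serpapi.com/search.json",
            query = list(engine  = "google",             # illustrative engine
                         q       = "quantum computing",  # illustrative query
                         api_key = Sys.getenv("SERPAPI_KEY")))

results <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))
# organic_results is typically parsed into a data frame of structured
# fields (title, link, snippet, ...) ready for a training dataset
head(results$organic_results[, c("title", "link")])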

Long-term Implications and Possible Future Developments

The emergence of services like SerpApi is indicative of the increasing reliance on machine learning technologies in the field of data analysis. Over the long term, the automation of search data collection would significantly enhance efficiency in these processes, allowing for more accurate and rapid insights.

The future of real-time data collection services like SerpApi appears bright. As more businesses adopt AI-based technologies for operational efficiency, the demand for similar services is poised to rise. Furthermore, with future advances in AI and machine learning, these search data collection tools may become even more efficient and sophisticated.

Actionable Advice

Based on these insights, it is prudent for businesses and individuals in the field of data science or AI development to familiarize themselves with services like SerpApi. Below is some actionable advice:

  1. Educate Yourself and Your Team: Ensure that your team is proficient in AI model training and analytics workflows. Additionally, familiarize yourselves with tools like SerpApi to navigate the shift towards automated real-time search data collection.
  2. Invest in Automation: Businesses should consider investing in machine learning technologies. These solutions streamline operations, reduce task redundancy, and ultimately improve efficiency and decision-making.
  3. Stay Informed: The field of AI and machine learning is perpetually evolving. It is important to stay up-to-date with the latest developments. This will not only provide competitive advantage, but also facilitate better decision-making strategies in the long run.

By leveraging tools such as SerpApi, you can streamline your data analysis processes, derive insights faster, and ultimately drive your business forward in the AI-driven world of tomorrow.
