
Women belong in the Lab: Science, gender and revolution in Lessons in Chemistry


Lessons in Chemistry, a show set in the mid-20th century and based on a famous novel by Bonnie Garmus [1], portrays the life of Elizabeth Zott, an excellent chemist with a hobby of cooking. It explores her journey as she is forced to navigate a world where her intellect is underestimated and her passion for science is constantly challenged by systemic biases.

Elizabeth Zott’s laboratory scenes emphasize the meticulous nature of experimental chemistry, demonstrating techniques such as titration, crystallization, and molecular analysis. From the first episode, where she uses a distillation apparatus to make coffee for her colleagues, to the scene where she builds a laboratory in her kitchen, the show flows smoothly while staying scientifically accurate.

The show keeps its scientific explanations easy to follow, using only minimal jargon. It also helps that the show is set in the 1950s, and most of the scientific advancements of that era are common knowledge today.

Elizabeth Zott’s laboratory scenes emphasize the meticulous nature of experimental chemistry, demonstrating techniques such as titration, crystallization, and molecular analysis. Photo: Apple TV

The theme of the show is the application of chemistry in daily life. After Zott’s research is stolen, she starts hosting a television cooking show and uses her platform to teach scientific principles through cooking. This innovative approach serves as a way to communicate chemistry to the public and empowers women by linking scientific understanding to practical, everyday tasks.

For example, Elizabeth explains the chemical reactions involved in baking bread, the role of emulsifiers in sauces, crystallization, and the molecular changes that occur during fermentation. These explanations elevate cooking from a domestic chore to a scientific endeavor, challenging societal perceptions of women and emphasizing that science permeates every aspect of life.

The show connects chemistry to the kitchen, illustrating the universality of scientific principles and the potential for science to be both accessible and empowering. It offers a fun, accessible way of communicating science to a part of society long discouraged from participating in it, and it inspires women to pursue their goals, as evidenced by a woman who starts medical school in her late thirties after taking inspiration from Elizabeth.

The series also explores the intersection of gender and science, offering a critique of the patriarchal structures that dominated scientific institutions during the mid-20th century.

Elizabeth’s journey highlights the challenges faced by women in science, such as exclusion from research opportunities, lack of recognition for contributions, and the pervasive belief that women were inherently less capable in scientific endeavors. Unfortunately, such discrimination persists even today, particularly in senior and leadership positions in academia.

Her character defies these preconceptions through resilience and intellectual prowess. The series makes a pointed statement about the historical exclusion of women from science and the wider societal consequences of that exclusion. By portraying Elizabeth as a chemist who defies gender norms, Lessons in Chemistry emphasizes the value of diversity in scientific research and the necessity of removing obstacles that impede fair participation.

Set in the mid-20th century, Lessons in Chemistry has had a notable cultural impact, particularly among women, owing to its scientific content. Elizabeth Zott serves as an example of how intelligence, willpower, and scientific curiosity transcend gender boundaries. With a lead female scientist who succeeds despite structural obstacles, the show conveys a strong message about persistence in scientific research.

In the first episode she says, “Of course, I would be much further along in my research if I wasn’t making excellent coffee for mediocre scientists.” 

Additionally, the series relates to current debates concerning equity in STEM fields. By drawing attention to the historical obstacles that women in science have encountered, it highlights the continued need for assistance and inclusivity for underrepresented groups in science. The story serves as a reminder that advancing science involves more than just making discoveries; it also entails fostering an atmosphere that allows for the growth of different viewpoints.

Calvin’s character highlighted the significance of collaboration and equal partnership in furthering scientific advancement and dismantling social obstacles. Photo: Apple TV

In Lessons in Chemistry, Calvin Evans, a gifted chemist, is Elizabeth Zott’s staunchest supporter and scientific equal. Calvin regarded her as an equal and acknowledged her academic expertise, providing both personal dedication and mentorship.

Calvin’s character highlighted the significance of collaboration and equal partnership in furthering scientific advancement and dismantling social obstacles. Calvin showcased this quality by publishing their research with Elizabeth as first author, even though the institute’s committee opposed it and blocked their grant submission.

Lessons in Chemistry beautifully combines the emotional weight of personal challenges with the discipline of scientific inquiry to provide an interesting look into science, gender inequity, and resilience. The presentation is thought-provoking and inspirational because of its dedication to illustrating the difficulties of women in science and the intricacies of chemistry.

It does have certain shortcomings: some storylines feel extraneous, and the narrative occasionally slips into over-dramatization, drifting from the main themes of science and determination.


Also Read: The Radium Girls – A tale of oblivious poisoning

From Land to Sea— The journey of Whales and Dolphins

Evolution is an incredible story of how life on Earth has grown and changed over billions of years. It’s the idea that every living thing, from the tiniest bug to the largest animal to the oldest plant, is connected and has adapted over time to survive. At evolution’s core lies natural selection, a simple but powerful idea: creatures best suited to their environment are more likely to survive and pass on their traits.

Over countless generations, these little changes add up, creating the amazing diversity of life we see today. Evidence is everywhere, from fossils to the DNA we share with other species. So it is safe to say that evolution shows all living organisms are connected, and that creatures adapt to their environment.

Cetaceans on Earth

Whales and dolphins belong to the order Cetacea, the marine mammals. Most amazingly, their ancestors once lived on land before gradually adapting to life in water over millions of years. Fossils of those ancient cetaceans, found in present-day Pakistan and India, are highly useful in understanding their fascinating evolution and the transition from land to sea.

Illustration of cetacean evolution. Credit: Dolphin Way

This journey illustrates the adaptability of cetaceans and the remarkable forms they evolved to thrive in aquatic habitats: streamlined bodies, limbs modified for swimming, and an enhanced respiratory system. All these changes helped them travel across oceanic spaces and form intricate social structures. It is a reminder that nature works through dramatic changes over long time scales.

Relatives of cetaceans

The closest living relatives of today’s cetaceans are the artiodactyls, which include hippos and cows. Their last common ancestor lived around 55 to 60 million years ago; after that, cetaceans became semi-aquatic while artiodactyls stayed on land. Cetaceans took to the sea in search of food and protection.

Pakicetus— the first cetacean

Pakicetus, which lived around 50 million years ago, was the first cetacean to become semi-aquatic. It was a wolf-like animal, and its fossils show that it was semi-aquatic yet could still run on land. At this stage, its diet consisted of fish and small rodents.

Pakicetus skeleton. Credit: Wikimedia Commons

Ambulocetus— the walking whale

About 49 million years ago roamed a cetacean known as Ambulocetus, dubbed the “walking whale” because it showed far more aquatic characteristics than its ancestors. Though still land-dwelling, it had many features that pointed toward a fully aquatic future. About as large as a modern-day lion, it had large paddle-like limbs and a robust body, which allowed it to roam on land and swim in water, along with a vertical tail for propelling itself forward. A carnivore, it hunted much like a modern-day crocodile, waiting in the water to ambush its prey.

Ambulocetus was dubbed the “walking whale.” Credit: Dinopedia

 

Rodhocetus— the biggest transition

About 47 million years ago, Rodhocetus appeared, marking a transition toward spending most of its time in water. Its tail evolved vertical fins, and it now relied mainly on its tail to swim. A major shift involved its nostrils: where its ancestors’ nostrils sat close to the snout, Rodhocetus’s were closer to the top of the head, a position that would eventually evolve into a blowhole. Its hind limbs were smaller, a trend that continued through cetacean evolution.

About 47 million years ago Rodhocetus became the latest cetacean. Credit: Eldar Zakirov

Dorudon— the first fully aquatic whale

Six to seven million years later (40 to 41 million years ago), a cetacean known as Dorudon swam in Earth’s oceans. It was one of the first cetaceans fully adapted for marine life and had the appearance of a small whale. Its hind limbs had evolved into flippers, essential for a streamlined body and efficient swimming.

The nostrils had shifted to the top of the head, forming a blowhole that let it surface for air without tilting its head, just like modern whales. Some skull features hint at early adaptations for enhanced hearing, which would later develop into echolocation (locating objects by reflected sound) in some cetacean lineages, such as dolphins, orcas, and some whales.

Dorudon was one of the first cetaceans fully adapted to marine life. Credit: David Arruda Mourao

Basilosaurus— made for the open ocean

About 35 million years ago swam an 18-meter cetacean known as Basilosaurus. It had an eel-like body, so unlike Dorudon it swam with a serpent-like motion. Its diet consisted of fish and smaller marine mammals. It was the first cetacean whose body was built for the open ocean, with ear bones adapted to sense water currents, an ability crucial for detecting prey.

Toothed whales and baleen whales

Finally, some 34 million years ago, the cetacean lineage split into toothed whales and baleen whales. Toothed whales include dolphins, orcas, and sperm whales. All are carnivores, and because they mainly hunt in deep, dark oceans, they all use echolocation: the animal emits clicks or sounds and waits for the sound to bounce back to “see” what is in front of it. Most toothed whales display high levels of intelligence, particularly dolphins, who live in pods and coordinate hunting attacks. Toothed whales are generally much smaller than their cousins, the baleen whales.
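The click-and-echo ranging described above can be sketched as a toy calculation. This is just an illustration, not biology: the numbers are assumptions, with 1,500 m/s being a typical speed of sound in seawater.

```python
# Toy echolocation range finder. A click travels out to the target and
# back, so the one-way distance is (sound speed x round-trip time) / 2.
SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s, a typical (assumed) value

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a target, given a click's round-trip travel time."""
    return SPEED_OF_SOUND_SEAWATER * round_trip_s / 2.0

# An echo returning after 0.2 seconds puts the target about 150 m away:
print(echo_distance_m(0.2))  # 150.0
```

Halving the round-trip time is the key step: the sound covers the distance twice, once out and once back.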

Baleen whales are a group that includes almost all types of whales. They are filter feeders, using baleen plates (hence the name) to strain large amounts of krill. Most baleen whales migrate to maximize food availability throughout the year. Baleen whales can be enormous, reaching around a hundred feet and weighing around 190 tons.

River Dolphins

Modern-day cetaceans are still adapting to their environment. For example, both baleen and toothed whales have evolved blubber to keep themselves warm in Arctic environments. Cetaceans can also store oxygen in myoglobin, which allows them to dive deep; this is crucial for sperm whales, which dive to depths of around 3,000 feet. Most whales can also slow their heart rate and direct blood only to crucial areas such as the brain and other vital organs, conserving oxygen for deep dives.

River dolphins’ neck muscles are much more flexible than those of their ocean counterparts. Credit: AnimalLife

River dolphins, or freshwater dolphins, have evolved some very distinctive features in response to their environment. They have smaller eyes than saltwater dolphins because rivers are murky and hard to see through, which is why river dolphins rely mostly on echolocation. Their neck muscles are also much more flexible than those of their ocean counterparts, letting them maneuver through the narrow stretches rivers often have.


Also Read: Adult Fruit Fly Brain Mapped: A Giant Leap to Understand the Human Brain?

Most Popular Stories from Scientia Pakistan in 2024


2024 was an eventful year for science and technology, and Scientia Pakistan continued its efforts to bring the best and latest news to our platform. Here are some of the top stories featured on our website this year.

Tap on the titles below to read the stories.

Gaza and the Vicious Cycle of Transgenerational Trauma

Category: Opinion

The ongoing war in Gaza was a heavily covered topic this year. In a conversation with neuroscience researcher Dr Ali Jawaid, our editor highlighted the transgenerational trauma and its impacts on children. Dr Jawaid referenced scientific studies, explaining how the effects extend beyond mental health.

The attack in Gaza and severe violence of human rights has brought an unprecedented level of trauma to the children in Gaza. There is no safe place and no sense of security, with thousands displaced from their homes. Caregivers are themselves fighting for survival and are unable to help children cope with their overwhelming emotional reactions.

The current attack in Gaza has brought an unprecedented level of trauma to the children in Gaza. There is no safe place and no sense of security, with thousands displaced from their homes. Photo: Arab News

Firewall Misadventure— How Pakistan’s Internet Controls Hurt Innovation and Growth

Category: Technology

The internet is essential for global commerce, innovation, and communication, but internet firewalls and Deep Packet Inspection (DPI) in Pakistan are increasingly used to address cybersecurity threats and content control. While these tools enhance security and content regulation, they also introduce challenges such as slower internet speeds, increased operational costs, and reduced productivity.

According to the Pakistan Software Houses Association (P@SHA), this firewall implementation costs an estimated 30 billion PKR and has resulted in potential losses of up to $300 million. Beyond these immediate financial losses, the decision has damaged Pakistan’s reputation as a potential tech hub among international clients, investors, and even its own tech innovators.

Global collaboration across borders is crucial for remote workers. Internet restrictions can disrupt access to international networks and platforms, affecting their ability to participate in global projects and communicate with clients or colleagues. Photo: Unsplash

Microbial life & the Space industry— Do we have all bases covered?

Category: Biology

The discovery of 13 antibiotic-resistant strains of Acinetobacter bugandensis aboard the International Space Station (ISS) sparked an internet frenzy, echoing sci-fi tropes of alien microbes hitchhiking to Earth. While these bacteria are highly resilient to antibiotics due to genetic adaptations to harsh space conditions, routine microbiological assessments on the ISS confirmed they pose no immediate threat to astronauts or humanity. However, the findings highlight the broader global concern of antibiotic-resistant superbugs, driven by the misuse and overuse of antibiotics on Earth.

Doctors come across a plethora of infections caused by bacteria in their clinical practice. While those infections are treated effectively in many instances by appropriate practice, sometimes antibiotics are unduly prescribed, underdosed, or given for durations that are shorter than what would be appropriate for the infection.

The isolated bacterium already has a pedigree of antibiotic resistance on Earth. Photo: Unsplash

Cooling Karachi – Combating Urban Heat with Green Spaces?

Category: Environment

The impact of climate change is undeniable, and areas like Karachi are becoming extremely vulnerable to it. The lack of trees and green spaces made this year’s summer unbearable and heavily affected the quality of life. Researchers argue that there is a dire need to develop parks and increase plantation efforts so the city has a sustainable cooling atmosphere and a more habitable environment.

The lack of green spaces that provide a cooling effect causes urban temperatures to skyrocket, making summers even more unbearable. Studies have shown that surface temperatures in cities can be a staggering 10-15°C higher than in surrounding rural areas (Mentaschi et al., 2022).  

An Edhi volunteer offers water to a passer-by, providing relief from the scorching heat in front of the Edhi Centre in Karachi’s Tower area on April 29, 2024. The Urban Heat Island (UHI) effect significantly impacts the quality of life in Karachi. Photo: Express

A Pioneer with Cracked Space Exploration Policy— Is the Hope Still Alive for Pakistan?

Category: Space

Pakistan’s inaugural lunar mission on May 3, 2024, marked a significant milestone for the country’s space program, reigniting national enthusiasm. Despite being a regional pioneer in space technology in the 1960s with SUPARCO’s establishment and achievements like Badr-I and PRSS-1, Pakistan’s space progress has lagged behind neighboring countries due to economic challenges, limited funding, and gaps in STEM education.

The lack of education and economic challenges are among the basic hindrances to Pakistan’s space exploration program. Despite these challenges, Pakistan has become the sixth country to launch its first-ever moon satellite: iCube Qamar.

Image captured by Pakistan’s satellite iCube-Qamar showing the moon. Credit: Suparco/CNSA

How to connect Doraemon with real life? A science fiction series with imaginative powers for the future

Category: Review

The beloved Doraemon series, a cornerstone of childhood for 90s kids, continues to inspire imagination and creativity in young minds. Beyond its entertaining surface, the show explores profound themes about technology, creativity, and human values. This review discusses how the Doraemon series emphasizes responsible technology use by showcasing the consequences of misuse and reinforcing lessons about honesty, accountability, and entertainment.

The Doraemon series stimulates creative thinking with its concept of a pocket filled with a vast array of gadgets, like the Anywhere Door and the Time Machine, encouraging viewers to think outside the box about new technological possibilities. Its storytelling also promotes creative thinking about handling different situations in life and interpersonal relationships.

The Doraemon series portrays a flourishing and positive relationship between humans and robots through the connection shown between Doraemon and Nobita. Photo: Unsplash

Also Read: Scientia’s Science Writing Internship— Bridges the Gap Between Science and Society

Gaia BH3: The Colossal Black Hole Next Door

Black holes are the monstrous cosmic giants of the vastness of space, devouring everything in their path. As a child, the idea that one could one day pull our entire planet into its abyss might have kept you up at night.

Let me bring back a little of that fear. Scientists have just discovered the most massive stellar black hole ever found, and it is not as far away as you would hope—only 2,000 light-years from Earth. That is like having a giant, hungry neighbor down the street in cosmic terms.

What are Black Holes?  

John Wheeler popularized the term black hole in 1967, describing a theoretical concept derived from Einstein’s General Theory of Relativity. Black holes were long considered purely speculative, until work by J. Robert Oppenheimer and others, and later research in the 1960s, suggested they could be real physical entities rather than just mathematical abstractions.

Black holes are cosmic objects whose gravity is so strong that not even light can escape their pull. They remain mysterious phenomena to this day. There are several varieties of black holes, including supermassive black holes, whose origins are still a mystery.

Sagittarius A* is a supermassive black hole located at the center of the Milky Way, theorized to have formed through the merging of multiple black holes or the remnants of successive supernovae. It was not formed directly from a stellar core collapse like Gaia BH3, a stellar black hole.

Photo copyright ESA / Gaia / DPAC – CC BY-SA 3.0 IGO. Based on Gaia Collaboration, P. Panuzzo, et al. 2024

GAIA BH3

Gaia BH3 is the most massive stellar black hole yet discovered in our galaxy. The European Southern Observatory confirmed the discovery on April 16, 2024. Its mass is about 33 solar masses, and it is located in Aquila, the Eagle constellation. Astronomer Pasquale Panuzzo commented: “No one was expecting to find a high-mass black hole lurking nearby, undetected so far. This is the discovery you make once in your research life.” (Meet Gaia BH3, 2024).

Gaia BH3 is part of a binary system, with a low-metallicity star orbiting it. How can the star orbit the black hole without being engulfed? The answer lies in distance: the star is too far away to be pulled past the event horizon, allowing it to orbit safely. Such black holes are called dormant black holes; they do not radiate and lack accretion disks, so they remain hidden (Sleeping Giant Surprises Gaia Scientists, n.d.).
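To get a feel for why the companion star orbits safely, consider how small a 33-solar-mass event horizon actually is. A rough back-of-the-envelope sketch (the Schwarzschild radius formula with standard constants; the comparison to an astronomical unit is illustrative, not a figure from the discovery paper):

```python
# Schwarzschild radius r_s = 2GM/c^2: the size of the event horizon
# for a non-rotating black hole of mass M.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # one solar mass, kg

def schwarzschild_radius_m(mass_kg: float) -> float:
    """Event-horizon radius of a non-rotating black hole, in meters."""
    return 2 * G * mass_kg / c**2

r_s = schwarzschild_radius_m(33 * M_SUN)   # Gaia BH3's reported mass
print(f"event horizon: {r_s / 1e3:.0f} km")  # roughly 97 km

AU_M = 1.496e11  # one astronomical unit (Earth-Sun distance), m
print(f"that is {r_s / AU_M:.1e} of an astronomical unit")
```

Even at 33 solar masses, the horizon spans only about a hundred kilometers, so a star orbiting at stellar-system distances is in no danger of crossing it.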

Wobbling star reveals its existence

Black holes cannot be observed directly but can be detected through the footprints, or signatures, they leave behind. The term “black” reflects their nature: they do not emit any light.

Similarly, Gaia BH3 caused its companion star’s orbit to wobble, a motion detected by the Gaia spacecraft. The analysis of that data hinted at the presence of a cosmic object with gravity extreme enough to influence the star’s motion.

ESA confirmed it as the Milky Way’s most massive stellar black hole and the second closest to Earth. Carole Mundell, ESA Director of Science, commented on Gaia’s capabilities: “It is impressive to see the transformational impact Gaia is having on astronomy and astrophysics. Its discoveries have reached far beyond the original mission of creating an incredibly precise map of over a billion stars in the Milky Way.”

DISCOVERY: The Gaia mission

The European Space Agency launched the Gaia mission in 2013 to make a three-dimensional map of our galaxy. Another purpose was to collect data on the positions and radial velocities of stars and other cosmic objects so that their properties can be studied precisely. Gaia can accurately chart up to a billion stars, enabling astronomers to conduct statistical analyses; even so, that is just about 1 percent of the stars in the Milky Way (Prusti et al., 2016).

The Gaia spacecraft currently operates at the Sun–Earth Lagrange point L2 and uses the principles of astrometry to observe the positions and velocities of cosmic objects. It is 200 times more accurate, and covers far more stars, than its predecessor Hipparcos, which was only able to chart the positions of 100,000 stars (Gaia Overview, n.d.).
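The core trick of astrometry is parallax: as Earth orbits the Sun, a nearby star’s apparent position shifts by a tiny angle, and that angle gives its distance directly (one parsec corresponds to a parallax of one arcsecond). A minimal sketch, with an illustrative parallax value chosen so the distance lands near Gaia BH3’s reported ~2,000 light-years:

```python
# Parallax distance rule: d [parsec] = 1 / p [arcsec].
# Gaia measures parallaxes in milliarcseconds (mas), hence the 1000.

def parallax_to_distance_pc(parallax_mas: float) -> float:
    """Convert a parallax in milliarcseconds to a distance in parsecs."""
    return 1000.0 / parallax_mas

PC_TO_LY = 3.262  # light-years per parsec

# An assumed, illustrative parallax of 1.63 mas gives ~613 pc,
# i.e. about 2,000 light-years:
d_pc = parallax_to_distance_pc(1.63)
print(f"{d_pc:.0f} pc = {d_pc * PC_TO_LY:.0f} light-years")
```

The smaller the parallax, the larger the distance, which is why measuring billion-star distances demands the extreme angular precision Gaia was built for.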

Jos de Bruijne, Gaia’s deputy project scientist at ESA, described the challenge as “the same as if you are trying to tell the shape of a building while you are inside it.” He emphasized that understanding the galaxy’s shape requires knowing the exact positions of its stars, which are both very far apart and very distant from us. “To accurately measure their positions in three dimensions requires extreme precision,” he added (Space.com, 2022).

Gaia BH3 shocked astronomers with its strangeness and with how long it remained hidden. The Gaia mission has changed that; as the saying goes in astronomy, it is all about knowing where to point your telescopes. The mission still holds much promise, and future data releases are expected to bring more discoveries and secrets to uncover, such as in the search for exoplanets.

The upcoming Gaia Data Release 4 (DR4), with its new catalogue of stars and Solar System objects, is expected to uncover numerous binary systems with dormant black hole companions. Each discovery will provide a chance to test different theories about how Gaia BH3 formed. Astronomers are now preparing for the release of DR4 at the end of 2025.

Gaia’s discoveries continue to push the boundaries of what we know about the universe, and with every new release, our understanding of the cosmos grows even deeper.

Don’t start panicking about Gaia BH3 pulling Earth into a cosmic abyss. That childhood fear of black holes gobbling up our planet? Yeah, that is not happening. It is still 2,000 light-years away, and Gaia BH3 isn’t exactly speeding our way. So feel free to relax, take a breath, and enjoy your day; there’s no need to start preparing for an interstellar apocalypse yet!

References:

  1. Gaia mission detects the most massive black hole of stellar origin in the Milky Way. (n.d.). Current Events. Retrieved December 1, 2024, from https://web.ub.edu/web/actualitat/w/missio-gaia-forat-negre-massiu
  2. Gaia overview. (n.d.). Retrieved December 1, 2024, from https://www.esa.int/Science_Exploration/Space_Science/Gaia_overview
  3. Meet Gaia BH3, our galaxy’s most massive stellar black hole. (2024, April 17). https://earthsky.org/space/gaia-bh3-milky-ways-most-massive-stellar-black-hole/
  4. Prusti, T., Bruijne, J. H. J. de, Brown, A. G. A., Vallenari, A., Babusiaux, C., Bailer-Jones, C. a. L., Bastian, U., Biermann, M., Evans, D. W., Eyer, L., Jansen, F., Jordi, C., Klioner, S. A., Lammers, U., Lindegren, L., Luri, X., Mignard, F., Milligan, D. J., Panem, C., … Zschocke, S. (2016). The Gaia mission. Astronomy & Astrophysics, 595, A1. https://doi.org/10.1051/0004-6361/201629272
  5. Space.com. (2022, June 13). Gaia: Mapping a billion stars. https://www.space.com/41312-gaia-mission.html
  6. Sleeping giant surprises Gaia scientists. (n.d.). Retrieved December 1, 2024, from https://www.esa.int/Science_Exploration/Space_Science/Gaia/Sleeping_giant_surprises_Gaia_scientists
  7. Gaia. (n.d.). European Space Agency (ESA). Retrieved November 24, 2024, from https://www.esa.int/Science_Exploration/Space_Science/Gaia
  8. Panuzzo, P., Mazeh, T., Arenou, F., et al. (2024). Discovery of a dormant 33 solar-mass black hole in pre-release Gaia astrometry. Astronomy & Astrophysics, 686, L2. https://doi.org/10.1051/0004-6361/202449763

Also Read: The First Photo of Our Milky Way’s Black Hole Reveals

Sands of Time— Denis Villeneuve’s Dune, Part One Redefines Epic Sci-Fi


Frank Herbert’s nuanced classic and futuristic epic “Dune” has been adapted into a science fiction magnum opus by Canadian filmmaker Denis Villeneuve. The most critically acclaimed adaptation yet, the film profoundly renders Herbert’s intricate world-building in an astounding cinematic spectacle.

The novel, a million-copy best-seller, is considered “one of the greatest science fiction novels of all time.” Adapting it for the big screen is no less than a miracle, as Villeneuve manages to cover the 800-page story of Paul Atreides on Arrakis in two hours and thirty-five minutes of run-time.

Vast dunes of Arrakis. Photo: Freepik

The film takes place in a futuristic setting in the year 10191. The story follows Paul Atreides, played by Timothée Chalamet, who is sent to the planet Arrakis with his parents, Lady Jessica and Duke Leto Atreides, played by Rebecca Ferguson and Oscar Isaac, to take over from the former rulers, the ruthless Harkonnens. Arrakis, informally known as Dune, is a fictional planet featured in Herbert’s novel and is integral to the film’s central plot for several reasons.

Despite being inhospitable and protected by massive sandworms, it serves as the grounds for harvesting “spice”: whoever controls Arrakis controls the spice, which in turn makes interstellar travel possible. Only the Fremen hold deep knowledge of the planet itself. As Paul’s fate as future ruler hangs in the balance, he navigates the impenetrable landscape of Arrakis with the help of the Fremen, letting the Dune saga unfold.

The casting is impeccable: Timothée Chalamet brilliantly embodies Paul, a teenage boy unsure how to fill his position after his father, Duke Leto. Lady Jessica, played by Ferguson, effortlessly personifies Paul’s dominant mother, and her exposition of the Bene Gesserit from the novel is equally impressive and a personal favorite.

The film also does a commendable job of incorporating intricate themes that complement the central plot, for example Paul’s prophecy as Muad’Dib, the discreet operations of the Fremen, and the sinister plotting of the Harkonnens. Viewers are kept in the loop of various events in the Dune universe, one after another, which hint at something larger to come. However, some events may leave a superficial impression on viewers who have not read the novel.

Nonetheless, Villeneuve compensates for this lack of context by showing Paul’s visions and dreams, in which he meets Chani, played by the enigmatic Zendaya. These meetings foreshadow Chani’s significance in Paul’s future journey among the Fremen.

Beneath Arrakis’ twin suns. Photo: Freepik

Unlike the illusionary vision Herbert intended for his novel, Villeneuve directs Dune in his own signature style, as seen in his prior films Blade Runner 2049 and Arrival. David Lynch’s 1984 Dune, by contrast, leaned into the novel’s psychedelic ambiance.

With a distinct futuristic cinematography style, Villeneuve carefully orchestrates wide shots of sandstorms, spaceships, and Paul’s visions through the extraordinarily gifted cinematographer Greig Fraser. Both Fraser and Villeneuve draw on the novel’s long monologues to visualize Arrakis, mostly from Paul’s point of view, throughout the film.

My favorite aspect is the visceral monochromatic color palette, which goes hand in hand with Fraser’s shooting style. For this purpose, Fraser particularly insisted on shooting scenes in natural light, with sand screens to create proper reflections. In addition, Fraser and Villeneuve experimented with camera angles for technical scenes, such as the ornithopter spinning around the worm.

The soul of Herbert’s book lies in the link between the ecosystem and spirituality, and the director-cinematographer duo masterfully engineer this phenomenon through specific shots in the film. Fraser’s ability to fully immerse himself in various cultures and capture relevant moments, coupled with Villeneuve’s scientific documentation, gives this book-to-film adaptation a coming-of-age quality that is, if you ask me, genius.

As for the costume design, the outfits for every character in the film are meticulously selected according to their character arcs. They are crisp, antique, and look like they weigh a ton, at least!

Dune part one
Ornithopters in Arrakis. Photo: Freepik www.freepik.com

If you think production design and cinematography are the icing on the cake, wait till you are mesmerized by the stunning editing and soulful music score. German film composer Hans Zimmer reunited with Villeneuve after Blade Runner 2049 for this project. Despite a supposed prior commitment to Christopher Nolan’s Tenet, Zimmer picked Dune instead, owing to his childhood love for Herbert’s novel. Paired with BAFTA winner Joe Walker’s intuitive and timely editing, the film finds its rhythmic beat, certainly the cherry on top!

This book-to-film adaptation stands out because of Herbert’s take on Bedouin culture, which is intricately woven into the characters’ lives in the Dune universe. As a Muslim viewer, reading the novel and watching the film in the theater kindles many emotions as you relate to the Islamic themes of dynasty politics, pilgrimage, and prophecy; it is both surreal and nostalgic. That a largely westernized film fraternity would take up an Islamic narrative as a massive-budget project had me sold for the first day, first show!

Dune is a relatively slow-paced film and can be watched without reading the novel. The film’s marketing was overblown, but that may have been necessary to recoup the large-scale investment in giving life to Herbert’s Dune universe. As most scenes are infused with imagery and colossal visuals, there is relatively little room to experience individual character arcs beyond Paul’s.

Still, for sci-fi zealots like myself, and those who have long treasured Herbert’s Dune, Villeneuve has certainly done justice to the book-to-film adaptation we all dreamt of. All in all, Dune is a bang for the buck: an aural and visual feast with an ensemble cast, leaving viewers craving more. Be sure to watch part two, where the dunes of Arrakis promise even greater battles and revelations!


Also, Read: Review: The Silent Contribution of Science Fiction to the Technological Advancements

The facts about Al Beruni’s experiments at Nandana, Pakistan


Nandana is situated in District Jhelum, Pakistan, about 60 miles southeast of Islamabad in a straight line, and can be reached by road in less than three hours. In the distant past, Nandana was a capital city of historic importance and remained an administrative district until the second half of the 18th century A.D. It was more or less inhabited up to the 18th century but was abandoned thereafter, its population shifting down to Baghan Wala in the plain below.

The site is presently abandoned and has recently been protected by the Department of Archaeology. To this day, a visitor cannot miss the conspicuous sight of the high mount, or its peak point, to which Beruni had once climbed up to take measurements for his experiment. These stand out clearly in the light of Beruni’s own observations.

For a long time, Beruni was anxious to access the sources of Hindu literature, astronomy, and other sciences. During the period of his service (399-406 A.H) with  Abul Abbas Mamoon Khwarizmshah, he became better acquainted with the power and position of Sultan Mahmud and the importance of the Ghazna court as a gateway to India.

During his stay at Nandana, Beruni accomplished more than one task. By halting there and making it a center for his inquiries, he extended his visits into the surrounding region, which was rich in minerals; Beruni was equally interested in precious stones.

In Nandana, Beruni observed the latitude of the place, which he noted in his Kitâb al-Hind alongside the latitudes of other places he had personally visited. The figure for Nandana, as given in the printed edition, is 32° 0′. Later, Beruni also calculated the longitude of Nandana (from the westernmost coastal point of Maghrib, North Africa), revised the figure for its latitude, and recorded both in his al-Qanun al-Masudi. In the printed edition of that work, the longitude and latitude of ‘Fort Nandana’ are 94° 43′ (E) and 33° 10′ (N), respectively.

Al Beruni, Nandana experiments
Diagram illustrating a method proposed and used by Al-Biruni to estimate the radius and circumference of the Earth. Credits: Wikimedia Commons

Nandana: The Scene and Setting for the Experiment

Beruni’s approach to Nandana was naturally from the northwest along the age-old route, traversing the roof-like elevated Ara valley and then descending south-eastward (from the sector of the present Ara village and the Ara Rest House) towards the Nandana Pass. The environment en route to Nandana, as it would have appeared to and impressed Beruni, can best be visualized through the vivid description left by a modern researcher, Sir Aurel Stein, following in Beruni’s footsteps: “I may now proceed to give an account of the route leading down from the Salt Range through the Pass of Nandana, and the remains of the ancient stronghold.

From the elevated ground of the Ara Plateau, at the height of about 2,400 feet, a steep winding road leads down over the rocky scarp of the range for close to 2 miles to where a small dip, about 200 yards across, at an average level of 1,300 feet stretches between two small valleys drained by streamlets which further south unite below the ruined stronghold of Nandana.  

Immediately above the dip referred to, which forms a natural fosse, rises the bold rocky ridge of Nandana, very abruptly. On its top, at a height of about 1,500 feet above sea level, it bears conspicuous ruined structures, and along the precipitous northern slopes below these, the remains of a boldly built line of walls, defended by bastions.

This fortified ridge completely bars further descent on the route; for the two small valleys above-mentioned contract on either side of it into deep and extremely narrow gorges, and descend for some distance between almost vertical rock walls, hundreds of feet high.” As one approaches the site, the northern slope of the rocky ridge, on which stood the fortified inner city, becomes prominent.

Before negotiating its bottom line, one passes through the ruins of the outer quarters of the city. Proceeding further and following the track higher up the slope, one sees the massive foundations of the fortification wall skirting this northern side and the remnants of the gateway leading into the walled city.

Though Beruni had a partial view of the plain through the Nandana Pass from his own quarters, he could not view it fully unless he either went to the other side of the Pass or climbed the mountain. He preferred to go up to the mountain top, both to size up the plain and to find the peak point from which a vertical measurement to its foot could be taken.

Temple at Nandana Fort. Credits: Dawn

To do so, he must have come out of the fortified part of the city, passed through the lower part of the city, traversed a long way toward the north-west, crossed the shallow rivulet waters flowing downwards into the gorge, climbed up the slopes of the spur along its north-western shoulder and reached high up on the top before he could have a full view of the plain. When he did so and had a full view of the vast level plain extending southward far off to the horizon, he took the final decision to try out his new method for determining the dimensions of Earth.

The Problem and the Method for Its Solution

The method of finding, by trigonometrical calculation, the circumference and other dimensions of the Earth by observing the dip of the horizon from a mountain peak was a fresh contribution by Beruni. He applied it for the first time in his Nandana experiment during 411-414 A.H.

He ascertained the sight line extending from the mountain peak and touching where the earth and the blue sky met (the horizon). The line so visualized from his standing position on the peak dipped below the horizontally fixed line by an angle of 0° 34′. He then measured the peak-to-bottom perpendicular height of the mountain and found it to be 652;3,58 zirāʿ (cubits), reckoned by the zirāʿ used as a cloth measure at that place (Nandana). Now angle T is a right angle, angle K is equal to the angle of the dip (0° 34′), and angle H is its complement, 89° 26′.

So if the angles of the triangle HTK are known, its sides will also be known in proportion to TK and the sinus totus (= 60). By this proportion, TK will be 59° 59′ 49″, while the excess between it and the sinus totus is 0° 0′ 11″. But that is the perpendicular height HL, which is known in zirāʿ, and the ratio of its (HL) zirāʿ to the zirāʿ of LK is the same as the ratio of 0° 0′ 11″ to 59° 59′ 49″.
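Beruni’s dip method reduces to a single relation: with dip angle d and mountain height h, the Earth’s radius is R = h·cos d / (1 − cos d). A minimal sketch in Python, using the dip of 0° 34′ from the passage and a height of about 652.055 cubits (an assumption here, since the printed figure above is garbled):

```python
import math

def earth_radius_from_dip(height_cubits, dip_deg):
    """Al-Biruni's relation: R = h * cos(d) / (1 - cos(d))."""
    c = math.cos(math.radians(dip_deg))
    return height_cubits * c / (1 - c)

dip = 34 / 60        # the observed dip, 0 deg 34 min, in degrees
height = 652.055     # mountain height in cubits (assumed value)

# A modern cosine gives roughly 13.3 million cubits ...
r_exact = earth_radius_from_dip(height, dip)

# ... while Biruni's rounded sexagesimal table (cos d = 59;59,49 out of a
# sinus totus of 60, excess 0;0,11) gives roughly 12.8 million cubits.
cos_sex = 59 + 59/60 + 49/3600
r_biruni = height * cos_sex / (60 - cos_sex)

print(f"exact trig     : {r_exact:,.0f} cubits")
print(f"Biruni's tables: {r_biruni:,.0f} cubits")
```

The difference between the two figures comes entirely from the rounding in the sexagesimal sine table; the method itself is exact.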

Reference: Al Beruni’s Nandana experiment by N A Baloch

Also read: A Noble Laureate’s Noble Gesture for His Teacher

Rapid Innovation in Quantum Computing is Reshaping the Landscape of Digital Security

Recent advancements, such as China’s Tianyan-504 system and Google’s Willow chip, demonstrate substantial progress in designing qubits—quantum units of information that exploit superposition and entanglement to process data at scales unimaginable with classical bits. Although these devices remain technically challenging and limited in scope, the rate of improvement suggests a future in which quantum hardware may break through longstanding cryptographic defenses.

This shift threatens conventional encryption methods used in financial transactions, government communications, and personal data protection and reverberates through Web3 ecosystems. Decentralized platforms, blockchain-based financial instruments, and tokenized digital assets rely heavily on cryptographic primitives that could become vulnerable to quantum attacks. Organizations and communities worldwide now face a pivotal choice: adapt to this new reality by adopting quantum-resistant solutions or risk exposing their digital infrastructures to unprecedented threats in the years ahead.

Quantum Hardware and Algorithms: Redefining Security Threats

At the heart of quantum computing’s promise lies the qubit, a fundamental building block that can exist in multiple states simultaneously. Engineers struggle to keep qubits coherent for extended periods, maintaining them at cryogenic temperatures and shielding them from the slightest interference.

Error-correction techniques must delicately monitor these states without collapsing them into classical outcomes. The Willow chip’s refined error management and Tianyan-504’s large qubit count point toward more stable systems. However, scaling from a few hundred to thousands or millions of reliable qubits remains a colossal challenge.

Despite these hurdles, the theoretical capabilities of quantum algorithms pose grave implications. Shor’s algorithm, for example, drastically reduces the difficulty of factoring large integers—a cornerstone of RSA-based encryption. Breaking RSA in a reasonable timeframe would upend traditional public-key systems that currently protect sensitive information.

Similarly, Grover’s algorithm accelerates brute-force searches, potentially weakening symmetric encryption methods by reducing the time required to guess keys. Although present-day quantum machines cannot yet implement these algorithms at the scale needed to shatter modern encryption, the trajectory is clear: once error-corrected, high-qubit systems emerge, cryptographic assumptions once seen as unbreakable may fail.
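The Grover speedup is easy to quantify: an unstructured search over 2^k keys takes on the order of 2^(k/2) quantum iterations, effectively halving a symmetric key’s bit strength. A back-of-the-envelope sketch:

```python
def effective_quantum_bits(key_bits):
    """Grover's algorithm needs on the order of sqrt(2**k) iterations to
    search a k-bit keyspace, so effective security is roughly k/2 bits."""
    return key_bits // 2

# AES-128 drops to ~64-bit quantum security, which is why guidance
# generally favors 256-bit symmetric keys in a post-quantum world.
for k in (128, 192, 256):
    print(f"AES-{k}: ~2^{k} classical guesses "
          f"vs ~2^{effective_quantum_bits(k)} Grover iterations")
```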

This looming threat extends beyond conventional cybersecurity models. Adversaries might capture encrypted data today, planning to decrypt it years later when quantum hardware matures. Long-lived secrets, such as state documents, corporate intellectual property, or sensitive health records, become vulnerable to this “harvest now, decrypt later” strategy. The prospect of future quantum decryption elevates the urgency of preparing defenses today, rather than waiting for quantum supremacy to catch organizations off guard.

Quantum Vulnerabilities in Web3 Ecosystems

The Web3 movement envisions a decentralized internet powered by blockchain technology, decentralized finance (DeFi) protocols, non-fungible tokens (NFTs), and smart contracts executing across distributed networks.

These platforms depend on cryptographic mechanisms to maintain trustless environments, secure digital identities, and manage tokenized assets without centralized intermediaries. Private keys underpin the ownership and transfer of cryptocurrencies and tokens, while secure hashing functions and digital signatures preserve network integrity and ensure participants adhere to protocol rules.

Quantum Computing
Failing to address quantum risks may erode confidence in decentralized ecosystems, leading to market instability, devalued assets, and lost user trust. Photo generated by AI

Quantum computing threatens to undermine these foundations. If malicious actors harness quantum algorithms to derive private keys from public addresses or forge digital signatures, they could manipulate smart contracts, drain liquidity pools in DeFi applications, counterfeit NFTs, or sabotage blockchain consensus. The ramifications would be devastating for users who trust the immutability and cryptographic reliability of these systems.

The decentralized nature of Web3 complicates the defense. Network-wide algorithmic upgrades require consensus among diverse participants—miners, validators, developers, and token holders—making transitions to quantum-safe cryptography a complex social and technical endeavor. Failing to address quantum risks may erode confidence in decentralized ecosystems, leading to market instability, devalued assets, and lost user trust.

Quantum-Resistant Cryptography and Defensive Strategies

Anticipating these challenges, researchers and standards bodies have focused on post-quantum or quantum-resistant cryptography. Unlike current methods that rely on problems easily solved by Shor’s or Grover’s algorithms, quantum-resistant schemes emerge from different mathematical foundations. Lattice-based cryptography, for instance, exploits the complexity of finding short vectors in high-dimensional grids. Code-based systems use error-correcting codes to present problems resistant to known quantum approaches. Multivariate cryptography and hash-based signatures add further variety, each grounded in assumptions that remain robust against quantum assaults.
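Of the families above, hash-based signatures are the easiest to see in miniature. The sketch below is a toy Lamport one-time signature, a simplified ancestor of standardized hash-based schemes such as SPHINCS+; it is illustrative only, and a key pair must never sign more than one message:

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()
BITS = 256  # we sign the 256-bit SHA-256 digest of the message

def keygen():
    # One random preimage pair per digest bit; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def digest_bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(BITS)]

def sign(sk, msg):
    # Reveal one preimage per bit of the digest -- this is why the key is one-time.
    return [sk[i][b] for i, b in enumerate(digest_bits(msg))]

def verify(pk, msg, sig):
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(digest_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"transfer 5 tokens to alice")
print(verify(pk, b"transfer 5 tokens to alice", sig))    # True
print(verify(pk, b"transfer 5 tokens to mallory", sig))  # False
```

Security rests only on the hash function being one-way, which is exactly the property Grover weakens but does not break. The cost is bulk: the public key here is 16 KB and single-use, which is why practical schemes layer Merkle trees on top.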

International efforts, including those led by the U.S. National Institute of Standards and Technology (NIST), aim to standardize these new algorithms. The selection process involves rigorous security analysis, efficiency testing, and implementation checks. Once a stable of proven quantum-resistant algorithms is established, migrating classical and decentralized systems to these standards will become a priority. Financial institutions, government agencies, and Web3 developers can then adopt these algorithms to safeguard future transactions and communications.

On top of cryptographic shifts, other quantum security tools offer additional resilience. Quantum key distribution (QKD) uses quantum states to exchange keys securely, revealing any eavesdropping attempt. Though challenging to implement at large scales and not a panacea, QKD could complement quantum-safe encryption methods, establishing a multilayered defense for critical connections. Meanwhile, quantum-secure protocols might enhance authentication systems, detect anomalies more efficiently, or ensure data integrity, turning quantum principles into defensive assets rather than threats.

In the Web3 arena, upgrading smart contracts to incorporate quantum-resistant keys and adjusting hashing algorithms become vital tasks. Developers may deploy hybrid approaches, mixing classical and post-quantum cryptography to ensure backward compatibility while incrementally strengthening security. Such gradual transitions help prevent sudden shocks and maintain user confidence.

Quantum Computing
Protocols might establish timelines for phasing in quantum-safe schemes, ensuring that wallets, node software, and decentralized applications support new cryptographic primitives. Photo generated by AI.

Navigating the Post-Quantum Transition and Future Outlook

The current limitations of quantum hardware give defenders a valuable head start. The quantum machines of today, including Tianyan-504 and Willow, remain at a proof-of-concept stage, still grappling with error rates and coherence issues. Yet, ignoring this window of opportunity would be shortsighted. Organizations must inventory cryptographic assets, identify vulnerable algorithms, and plan orderly migrations to quantum-resistant solutions. The cost of inaction grows with each step quantum computing takes toward feasibility.

For Web3 communities, consensus-based upgrades may require on-chain governance votes or carefully orchestrated forks. Protocols might establish timelines for phasing in quantum-safe schemes, ensuring that wallets, node software, and decentralized applications support new cryptographic primitives. This collaborative adaptation maintains the core principles of decentralization—open participation, transparency, and stakeholder input—while strengthening security foundations.

Ultimately, quantum computing’s influence on cybersecurity and Web3 can be managed through foresight and preparation. Rather than reacting to a crisis once a powerful quantum machine is unveiled, the global community can adopt preventive measures now. Incorporating quantum-safe cryptography, experimenting with quantum-secure protocols, and preparing migration paths for decentralized networks position organizations and users to weather the quantum transition.

This proactive stance preserves the integrity and functionality of financial services, digital marketplaces, and governance mechanisms that define the decentralized internet. It reassures participants that their assets and identities remain protected even as computational frontiers expand. Quantum computing may reshape cryptographic challenges, but with careful planning and timely implementation of new standards, the promise of a secure digital ecosystem—classical or quantum—can endure.


Also, Read: Quantum Computing 101

NASA’s Parker Solar Probe Breaks Records: Closest-Ever Encounter with the Sun on Christmas Eve

Yesterday, on Christmas Eve, Dec. 24, at 6:40 a.m. EST, NASA’s Parker Solar Probe made its closest-ever approach to the Sun, at a distance of roughly 3.86 million miles. The spacecraft was launched in 2018 to observe our Sun and its outer corona. Interestingly, the probe is the fastest object ever built by humanity, reaching 690,000 km/h (191 km/s), nearly 0.064% of the speed of light.

Dr. Nicola Fox, head of science at NASA, told BBC News: “For centuries, people have studied the Sun, but you don’t experience the atmosphere of a place until you actually visit it.”

A closest approach of 3.8 million miles (6.2 million km) from the surface of the Sun may not sound close. Still, NASA scientist Dr. Nicola Fox puts it like this: “We are 93 million miles away from the Sun, so if I put the Sun and the Earth one meter apart, Parker Solar Probe is four centimeters from the Sun – so that’s close.”
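Dr. Fox’s scale model checks out arithmetically: shrinking the 93-million-mile Earth-Sun distance to one meter puts the probe about four centimeters from the Sun.

```python
earth_sun_miles = 93_000_000
probe_sun_miles = 3_800_000

# Scale the real distances down so that Earth-Sun = 1 meter.
scale = 1.0 / earth_sun_miles             # model meters per real mile
probe_cm = probe_sun_miles * scale * 100  # convert model meters to centimeters

print(f"Probe-Sun distance at model scale: {probe_cm:.1f} cm")  # ~4.1 cm
```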

The probe has been designed to withstand temperatures of up to 2,500°F (1,370°C) with its Thermal Protection System. During this flyby, it will endure temperatures of up to 1,400°C, and that much radiation could frazzle its onboard electronics. It is protected by an 11.5 cm (4.5 in) carbon-composite shield, and the spacecraft is expected to tactically get in and out fast, at the highest speed ever achieved by a human-made object.

In human terms, this is the equivalent of catching a flight from New York to London in less than 30 seconds, at a mind-blowing 430,000 mph. This massive speed comes from the gravitational pull of the Sun as the probe swings through perihelion, its closest point to the Sun.

Sun
Photo: NASA

One longstanding mystery concerns the outer atmosphere of the Sun, which we call the corona. Solar physicists know that the corona is really, really hot, but not why. Shockingly, the surface of the Sun is about 5,500°C (5,772 K), yet the corona, the outer atmosphere we see during solar eclipses, measures millions of degrees. The question is: how can the atmosphere be hotter than the surface?

Through this mission, scientists hope to understand the corona and the constant solar wind bursting out of it. These particles interact with Earth’s magnetic field, giving us views of the beautiful Northern Lights. They also cause space-weather problems for our satellites, electronics, and communication systems.

Sun
Interactions between the radiative and convection zones within the Sun’s interior contribute to heating our star’s corona. Astronomy: Roen Kelly

That’s why scientists think that by understanding the Sun and its activity in detail, including the space weather and the solar wind, we can understand more about their effects on our daily lives.

NASA scientists had waited for this Christmas for the Parker Solar Probe to reach its nearest flyby of the Sun and still be ‘safe’ for future observations. They recently tweeted through @NASASun: “Parker is amid its flyby and can’t communicate with us until Dec. 27, when it will send its first signal to let us know it’s safe.” Hopefully, it will be safe and help us explore the Sun’s secrets in more detail.

More from the Author: The Fundamentals and Applications of the Blockchain World with Boone Bergsma

Solving the Protein Folding Problem: A Journey from Experiments to AI Algorithms

This year’s Nobel Prize in Chemistry demonstrates the rising power of computational and AI tools to assist scientists toward greater discoveries.

In the early 20th century, Alois Alzheimer, a psychiatrist and neuropathologist, observed abnormal webs and tangles under the microscope in postmortem brain samples of people who had suffered early-onset memory loss (dementia). He could not, however, identify what they were made of. Over the years, scientists have identified them as clumps of misfolded proteins, and the condition is now named after Alzheimer.

Proteins are chains of amino acids, one of the building blocks of life, so much so that many scientists believe the formation of amino acids on Earth was a significant step toward the origin of life. There are thousands of proteins in the human body performing diverse functions: name a function, and there is invariably a protein associated with it.

Intriguingly, proteins, made by chaining different combinations of a mere 20 amino acids, can show this large diversity of functions. It all boils down to the way a protein is folded, or what scientists call its ‘native structure’: how the string of amino acids is arranged in 3D space.

A protein can perform its assigned function only if properly folded. A denatured protein (one that has lost its 3D structure, like an open random coil) or a wrongly folded one cannot. Misfolding of proteins is linked to several debilitating conditions like Parkinson’s, Amyotrophic Lateral Sclerosis (ALS), and Alzheimer’s, as Alzheimer observed under his microscope.

There are, however, millions of ways in which a protein could fold. Imagine millions of rugged valleys over a vast landscape, where the goal is to throw a stone into the deepest valley. The same analogy applies to finding the native structure of a protein. This is termed the ‘protein folding problem’.

In 1961, Christian Anfinsen of the National Institutes of Health (NIH) observed that denatured proteins can fold back to their original functional state in a matter of seconds. That proteins manage to do so in such a short time frame, despite the multitude of possibilities, is a paradox, now famously known as Levinthal’s Paradox after Cyrus Levinthal, the MIT scientist who proposed it in 1968.
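Levinthal’s point is easy to appreciate with rough numbers. Assuming, purely for illustration, three backbone conformations per residue for a 100-residue protein, sampled at an optimistic 10^13 conformations per second:

```python
import math

residues = 100
conformations_per_residue = 3   # illustrative assumption
sampling_rate = 1e13            # conformations tried per second (optimistic)

total = conformations_per_residue ** residues  # ~5e47 conformations
seconds = total / sampling_rate
years = seconds / (3600 * 24 * 365)

print(f"search space : ~1e{math.log10(total):.0f} conformations")
print(f"random search: ~1e{math.log10(years):.0f} years")
```

A blind search would take on the order of 10^27 years, vastly longer than the age of the universe, yet real proteins fold in seconds; hence the paradox.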

These observations suggested that the 3D structure is coded in the sequence itself and that some important physical forces are in play that direct the protein to be folded a certain way, making its most stable state (the native state) easily accessible, rather than searching for the stable state randomly.

Protein Structure: The History

Scientists have been working on identifying the structure of proteins since the 1930s. The landmark discovery came when John Kendrew and Max Perutz figured out the structures of myoglobin and hemoglobin, the oxygen-storing and oxygen-carrying proteins respectively, using X-ray crystallography, pretty much like photographing the atoms of a protein with X-rays. Identifying a protein structure, though, is no easy task; it requires weeks or months of painstaking experiments and analysis.

In the early days of crystallography, scientists would spend years trying to crystallize proteins to study their structures, and many proteins simply couldn’t be crystallized. Several more months were required to analyze the experimental outcomes and come up with a sensible structure. Over the years, the field of structural biology evolved, with people finding structures of more and more proteins. More advanced tools including cryoelectron microscopy and NMR were being used routinely to study protein structure.

John Kendrew (left) and Max Perutz with their model proteins
John Kendrew (left) and Max Perutz with their model. Credit: Medical Research Council Laboratory of Molecular Biology, UK

Dr. Mohd Taher, a postdoctoral researcher working on proteins and enzymes in the Department of Chemistry, University of Illinois Urbana-Champaign, USA, says, “Seeing is believing.” Although researchers were successful in identifying protein structures, one question remained largely unanswered: given a sequence of amino acids, is it possible to predict the native structure?

This quest inspired a group of structural biologists to start CASP (Critical Assessment of Protein Structure Prediction), a friendly competition held every two years to enhance the pace of advances, in which participants use their models to predict structures of proteins that are not yet publicly available.

Some of the earlier winners tried predicting structures from the physicochemical properties of amino acids, modeling how their interactions direct the formation of the 3D structure. Others came up with the idea of looking at several related proteins to find patterns in how similarly coded regions fold.

Yet others looked at amino acids that mutated together during evolution and postulated that if they changed together, they should be close to, and influence, one another in the folded state. The success of prediction, however, remained bleak, mostly below 50 percent accuracy.

A game-changer in the protein folding problem

However, the CASP competition of 2020 was a game-changer in the field of structure prediction. Researchers from Google DeepMind, John Jumper, Demis Hassabis, and their team showcased AlphaFold2, built with improved deep-learning algorithms that used “transformers” to learn from hundreds of thousands of known protein structures and applied this learning to predict the structure of a new protein.

The earlier version, AlphaFold1, presented at CASP 2018 with algorithms based on convolutional neural networks, had placed among the top five competitors. This time, DeepMind’s algorithm outshone the other competitors by a large margin. The CASP jury was in for a surprise: AlphaFold2 produced structures that were more than 90 percent accurate on the tested proteins.

AlphaFold 2 performance, experiments, and architecture
AlphaFold 2 performance, experiments, and architecture. Credit: Wikimedia

The team described that they designed novel “training procedures based on the evolutionary, physical and geometric constraints of protein structures.” In the study published in Nature, they discuss the structure of the neural network used to train AlphaFold.

“The complex layers of neural networks succeeded in learning the outcomes of the physical processes of protein folding, capturing effects such as the propensity of some amino acids to form certain shapes, like alpha helix and beta-sheets, and the interactions of amino-acids with the surrounding environment (water and other amino acids)”, says Taher.

AlphaFold managed to predict the protein structure from an amino acid sequence in mere minutes, compared to experiments that took several months. “AlphaFold, however, cannot replace experiments. The final validation requires an experimental structure determination,” says Dr. Natesh Ramanathan, Associate Professor in the School of Biology and the Center for High-Performance Computing (CHPC), Indian Institute of Science Education and Research, Thiruvananthapuram, India.

Talking about the significance of prediction tools in protein research, Dr. Natesh said, “These computational tools help speed up the experimental identification of protein structures, allowing researchers to focus on more advanced problems.”

Is AlphaFold memorizing instead of learning?

The success of AlphaFold in accurately predicting protein structures is no doubt one of the best examples of the AI revolution in science. However, there is still scope for improvement. A recent case study by a team at NIH, Bethesda, USA, showed that AlphaFold fails to predict the structures of proteins that switch shapes as part of their function.

They showed evidence that the algorithm at some point had started to memorize the patterns rather than learning them, leading to incorrect predictions for more complicated structures. Dr. Natesh says, “As is the case for any method of bioinformatics, AlphaFold too is only as good as the database it is trained on.”

However, as scientists rely on increasingly sophisticated computational tools to predict protein structure, there also comes a downside: it is quite difficult to decode which factors contribute to the final result. The AI algorithm works as a black box that spits out protein structures, leaving researchers still wondering what led to them.

Digitally rendered image of a protein structure prediction by AlphaFold
The success of AlphaFold in the accurate prediction of protein structures is no doubt one of the best examples of the AI revolution in science. Credits: DeepMind

It is also not clear whether the models have learned some new physics that humans have not yet figured out. It is an interesting question, since machine learning algorithms are designed to identify patterns that might be invisible to humans. This might be the case, but it is difficult to tease this information out.

While researchers can now predict more accurate structures, the fundamental questions remain: what is the complete physics underlying protein folding, and how does the process happen so fast? According to Dr. Natesh, “In the A to Z of the protein folding problem, steps B to Y are still unsolved.” But for many important applications, one can work with the output structure without worrying much about how the algorithms zeroed in on it.

From prediction to design

While many were interested in solving the protein folding problem, David Baker of the Institute for Protein Design at the University of Washington wished to go a step further. A regular participant in CASP, Baker was working on protein structure prediction, developing an algorithm called Rosetta that models the interactions between amino acids to predict structure.

He envisaged an idea: why not use the existing knowledge of protein folding preferences to design a completely new protein, a string of amino acids that would fold into a shape suited to a specified function? This is essentially the reverse of the problem that AlphaFold addresses.

This problem is considerably different from protein engineering, which has been around for a while and involves modifying existing proteins to improve efficiency or perform new functions. Smaller steps in this direction were taken by other research groups by the end of the 1980s, making short strings of amino acids called peptides, inspired by naturally occurring proteins.

The arrangements were predicted taking into account that some amino acids are hydrophobic (they stay away from water) while others are hydrophilic (they readily interact with water). But it was David Baker’s group in 2003 that achieved the remarkable feat of computationally designing an entirely new protein whose structure and amino acid sequence bore no similarity to any known protein.
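The hydrophobic/hydrophilic pattern of a sequence is simple to inspect computationally. Below is a minimal, illustrative Python sketch, not any group’s actual design code, that classifies each residue of a peptide using a common textbook grouping of the 20 standard one-letter amino acid codes; the example peptide is hypothetical.

```python
# Minimal sketch: classify each residue of a peptide as hydrophobic or polar.
# The grouping below is a common textbook convention, not a design algorithm.

HYDROPHOBIC = set("AVLIMFWPG")    # nonpolar residues that tend to avoid water
HYDROPHILIC = set("RNDQEHKSTYC")  # polar/charged residues that interact with water

def hydropathy_pattern(peptide: str) -> str:
    """Return one character per residue: 'H' (hydrophobic) or 'P' (polar)."""
    pattern = []
    for residue in peptide.upper():
        if residue in HYDROPHOBIC:
            pattern.append("H")
        elif residue in HYDROPHILIC:
            pattern.append("P")
        else:
            raise ValueError(f"Unknown residue: {residue}")
    return "".join(pattern)

# A hypothetical 10-residue peptide:
print(hydropathy_pattern("MKTAYIAKQR"))  # prints HPPHPHHPPP
```

Patterns like these were one input the early peptide-design efforts described above relied on when predicting arrangements.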

“This was something that was never achieved before”, said Dr. Natesh. “Not only did they design a protein made of 93 amino acids (now called Top7) “de-novo” (meaning anew) using computational tools, but they validated it using crystallographic techniques.”

This has profound implications for many different fields, including medicine, health, and biotechnology. A group at the Institute for Protein Design used computational protein design to develop a vaccine for SARS-CoV-2. “It’s exciting”, says David Baker in an interview on the Nobel Prize website.

Way Forward

Both these feats, which jointly won the Nobel Prize in Chemistry this year, demonstrate the rising power of computational and AI tools to assist scientists toward greater inventions. Together, they have opened new avenues for innumerable applications.

Timelines have been compressed drastically with AlphaFold. Designing proteins for different functions, ranging from medicines to molecules that catalyze difficult reactions, be it capturing methane or carbon dioxide from the atmosphere or helping break down plastics, could lead to sustainable solutions.

However, protein structure, albeit a significant aspect, is not the only one needed to address real-life problems. To create a new drug, information is required on how the drug interacts with a protein. One needs to understand the behavior of proteins in the more complex environment of living cells. Scientists are already working on tackling these challenges, one step at a time.


Also Read: Microbial life & the Space industry— Do we have all bases covered?

Cracking the Enigma of Crimes by Nanotechnology with Dr. Shahid Nazir Paracha

You may have heard of Sherlock Holmes, the fictional detective who solves mysterious criminal cases using exceptional deductive reasoning, observational skills, and scientific knowledge to analyze evidence. Such intellectual characters also exist in real life.

A proud forensic scientist, Dr. Shahid Nazir Paracha has dedicated more than a decade to advancing forensic science in Pakistan. He has experience in both field investigations and laboratory analysis. Dr. Shahid has contributed to solving cases involving homicide, rape, terrorism, and personal identification. He has also trained many professionals, leaving a lasting mark on the forensic landscape of Pakistan.

Dr. Shahid is currently associated with the Department of Forensic Medicine, University of Health Sciences (UHS), Lahore. He is an adjunct faculty member at the Punjab University Law College. He is a special Editor of forensic science and criminology in the Journal of Basic & Clinical Medical Sciences. Before joining UHS, he served the Punjab Forensic Science Agency (PFSA) as a forensic scientist. Dr. Shahid’s expertise extends beyond conventional forensics: in the book “Modeling and Simulation of Functional Nanomaterials for Forensic Investigation”, published in 2023, he explores the use of nanotechnology in enhancing forensic precision and efficiency.

Forensic Science expert
Dr. Shahid is currently associated with the Department of Forensic Medicine at UHS, Lahore.

In an insightful conversation, Dr. Shahid discussed his journey to forensic science, integrating forensic science and nanotechnology, and Pakistan’s position in adopting these changes. Here are some snippets from our engaging conversation.

Hifz: Thank you for sparing time from your hectic schedule. We will start with your journey: what inspired you to delve into the world of forensic science?

Dr. Shahid: This is quite tricky; I came to this field accidentally. In 2011, we were the first, pioneering batch of the MPhil in Forensic Science in Pakistan. Before this, only separate subjects were available, like Forensic DNA and Forensic Chemistry.

At that time, the PFSA was in its pre-operational phase, and former Director General Dr. Muhammad Ashraf Tahir, an expert in this field, came from the USA to establish the PFSA in Lahore. With his help, the University of Veterinary and Animal Sciences (UVAS) Lahore planned the first-ever forensic science program in Pakistan’s history.

Luckily, Dr. Ashraf Tahir supervised my MPhil project. Soon after completing my MPhil, I got a job at the PFSA, and as we delved into real cases, I came to understand the strength of forensic science. I moved into practical, applied forensics, or you could say the fascination of forensics took hold: this is the science for justice. We performed hundreds of cases and took satisfaction in knowing that, with our help, someone was getting justice.

In the meantime, alongside the job, I continued with a PhD in Forensic Science, joining the first PhD Forensic Science batch at UHS in 2015. I joined UHS as a full-time employee in 2017. I have been honored and privileged, and I dedicate all of this to the supervisors and teachers who guided me this far.

Punjab Forensic Science Agency Laboratories, Lahore (Credits: PFSA)

Hifz: As you mentioned, forensic science in Pakistan traces back to 2011, and you are one of the country’s early forensic experts. What exactly is forensic science? What is Pakistan’s current position in the forensic world?

Dr. Shahid: Forensic science is a multidisciplinary field that applies scientific methods and principles to investigate crimes and provide evidence for legal proceedings. It involves the analysis of physical evidence, such as DNA, fingerprints, bloodstains, firearms, and digital evidence, to reconstruct events and link suspects to crimes. There are many subdisciplines: Forensic Genetics, Forensic Toxicology, Forensic Chemistry, Digital Forensics, Crime Scene Investigation, and Ballistics.

Body and evidence marking at the scene of the crime (Credits: Cottonbro studio)

Pakistan’s law-and-order and justice conditions are unstable; that said, forensic science is crucial for ensuring justice, maintaining law and order, and reducing crime in Pakistan.

Forensic science is vital for countering terrorism and controlling crime; it offers the most advanced techniques for solving terrorism cases, which would not be possible without it. To increase the efficacy of our judicial system over the next 10-12 years, there is a great need for every crime to have a forensic report.

With the help of forensic science, homicide, sexual assault, and drug trafficking cases are being solved. Our work is definitely of international standard, as we have the privilege of the PFSA, a world-renowned lab and Asia’s biggest, where forensic methods are implemented with standard protocols.

We claim that a report issued by the PFSA Laboratory cannot be challenged anywhere in the world. Human rights are often neglected in Pakistan, and many of our people are underprivileged. In sexual assault cases like Zainab’s murder, and there are many examples like this, without forensic DNA technology it would not be possible for law enforcement agencies to reach the suspect.

The challenges are limited financial resources, limited access to advanced technologies, and a lack of skilled persons or experts. I already quoted the example that there are only 3-4 PhDs in pure forensic science in Pakistan, which causes a technological gap.

Hifz: In recent times, you co-authored a book “Modeling and Simulation of Functional Nanomaterials for Forensic Investigation”. For those unfamiliar with the concept, how would you define nanotechnology and its significance in forensic science?

Dr. Shahid: Nanotechnology is the manipulation and application of materials at the nanoscale, typically between 1-100 nanometers. At this scale, materials exhibit unique physical, chemical, and biological properties. Nanotechnology is not a very new field; it has a recent past and is already being used in different disciplines, such as chemistry.

With nanomaterials, nanocomposites, and nanoparticles, we can work at the nanoscale, typically 1-100 nanometers. We have adapted nanomaterials that can be utilized and are helpful in forensics.

Take fingerprint detection, for example. Normally we use simple dust or black powders, which are magnetic- or chemical-based. Nanoparticle powders of gold, silver, and zinc oxide enhance the visibility of latent prints; they are environmentally friendly and give highly sensitive, specific results.

Latent fingerprint development and lifting kit. (Credits: Carolina Biological Supply)

In DNA analysis, we use an expensive method for STR analysis. Research is ongoing into how to obtain purified DNA from small and degraded samples more effectively than traditional methods, with the help of nanomaterials. Nanotechnology applies to drugs, toxins, biological evidence, trace evidence, explosives, and fire debris analysis as well.

The advantages of nanoparticles in forensics are that they are highly sensitive and specific. They provide rapid analysis through rapid test kits, which are portable and can be brought to the crime scene.

We don’t yet have a method to collect DNA and run the Polymerase Chain Reaction (PCR) at the crime scene, but with the help of nanoparticles, portable devices will be available in the future. This enhances accuracy and reduces the risk of contamination and false positives.

Hifz: A couple of weeks ago, I reviewed the literature on chemical terrorism and the role of forensic science. In incidents of terrorism, detecting explosives in an open environment is a major challenge. Could you explain how nanosensors enhance sensitivity and accuracy in this domain?

Dr. Shahid: Nanomaterials have revolutionized explosive detection by offering unparalleled sensitivity, specificity, and speed in identifying explosive compounds. Their unique properties, such as high surface area and conductivity, make them ideal for detecting trace amounts of explosives in complex environments, like humid conditions.

Explosives release Volatile Organic Compounds (VOCs) that interact with nanomaterials, causing measurable changes in their electrical, mechanical, and optical properties, seen as shifts in color, current, or magnetic response.
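As a purely illustrative sketch, not a real sensor interface, the signal-processing side of such a detector can be reduced to comparing a measured property change against a calibrated baseline; the function name, values, and threshold below are all hypothetical.

```python
# Toy sketch: flag a possible vapor detection when a sensor's measured
# resistance deviates from its calibrated baseline by more than a chosen
# relative threshold. Values and threshold are hypothetical.

def detect_vapor(baseline_ohms: float, reading_ohms: float,
                 threshold: float = 0.05) -> bool:
    """Return True if the relative resistance change exceeds the threshold."""
    relative_change = abs(reading_ohms - baseline_ohms) / baseline_ohms
    return relative_change > threshold

print(detect_vapor(1000.0, 1080.0))  # 8% change -> True
print(detect_vapor(1000.0, 1020.0))  # 2% change -> False
```

Real nanosensor systems are far more involved (drift compensation, pattern recognition across sensor arrays), but the threshold-against-baseline idea is the same.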

Nanostructured sensors include metal oxide particles, carbon nanotubes (CNTs), and graphene-based sensors, all of which can detect the changes that occur when these organic compounds interact with nanoparticles.

Several types of nanoparticles are involved in detection, including quantum dots and metal-organic frameworks (MOFs). They emit fluorescence when they react with different types of explosive vapors, in humid or other environments.

For field detection, nanosensors are built into portable devices. On-site lab-on-chip detection is our project: lab-on-chip devices are small chips coated with different types of quantum dots and nanomaterials. We can bring them to crime scenes and detect explosives rapidly in the field.

As far as chemical terrorism is concerned, nanosensors gain sensitivity from their high surface-to-volume ratio and enhanced electrical properties. They have tailored surface functions and quantum effects. They offer improved accuracy compared to other chemical methods, which rely on color changes to indicate the presence of explosive materials.

Nanosensors can be designed to target molecules specific to explosives. They can be integrated with pattern recognition systems and give rapid signal responses even in inhibiting environments. If conditions are rough, with rain or moisture, chemical methods may struggle, but nanomaterials still offer the most accurate and sensitive detection of trace explosives.

Hifz: Biological molecules and body fluids are very sensitive and present only in trace amounts at the crime scene. Crime scene investigators take great precaution while lifting them, for fear of contamination. So how can we employ nanomaterials to detect biological materials like DNA? Is it possible to use similar approaches for body fluids?

Dr. Shahid: Nanomaterials have shown tremendous potential for detecting biological materials such as DNA and body fluids due to their unique properties, including high surface area, biocompatibility, and the ability to be functionalized for specific molecular interactions. They enable highly sensitive, rapid, and specific detection techniques, which are particularly valuable in forensic science, medical diagnosis, and environmental monitoring.

The trace amount of blood (source of DNA) at the crime scene (Credits: Cottonbro studio)

Currently, we use different chemical methods based on strips and color change, or chromatographic techniques, each with its limits and challenges. Nanoparticles can interact with DNA through various mechanisms, like adsorption and hybridization. This interaction is used to detect, quantify, and analyze DNA sequences.

The approaches we use for nanoparticle detection are based on somewhat costly methods. Gold nanoparticles functionalized with complementary DNA probes change their optical properties upon hybridization, enabling colorimetric detection of DNA sequences. Graphite and graphene oxide have a strong affinity for single-stranded DNA (ssDNA).

Body fluids like blood, semen, and saliva interact with specific biomarkers or chemical components. For biological detection there are colorimetric and fluorescence sensors; surface-enhanced Raman spectroscopy, the major nanoparticle technique; graphite sensors; magnetic nanoparticles; and nanostructured biosensors.

We use nanoparticles for biological detection because they give speedy results, are portable, and support multiplexing: they can simultaneously detect multiple targets, such as different DNA sequences. There are some challenges, like the high cost of gold nanoparticles, but given their stability and standardization, scientists are working to reduce that cost.

Hifz: What are the most promising advancements in nanotechnology that will revolutionize forensic science?

Dr. Shahid: Definitely, Hifz. Nanotechnology is paving the way for groundbreaking advancements in forensic science by offering innovative tools and techniques that improve sensitivity. Advanced nanosensors are impactful, and different types of portable surface-enhanced Raman spectroscopy devices are available worldwide.

To enhance DNA analysis, magnetic nanoparticles are available for isolating and purifying DNA from complex samples, yielding high-quality DNA; gold nanoparticles are another example. For latent fingerprints, quantum dots and metallic nanoparticles such as gold, silver, and magnesium enhance prints compared to the powder technique; thanks to nanoparticle luminescence, they reveal fingerprints under UV light with high clarity.

For body fluids like blood, semen, and saliva, there are biomarkers and advanced techniques available, like Lab-On-Chip: different nanomaterials or nanosensors fitted onto a simple, small chip that can be used at crime scenes for analysis. For drug and toxin detection, different types of nanomaterials functionalized with molecularly imprinted polymers to capture and identify specific substances are available.

Nanotechnology also enhances crime scene reconstruction through microscopic evidence analysis; nanoparticles for 3D imaging and mapping are available. For microscopic trace evidence like Gunshot Residue (GSR), we can analyze GSR with a Scanning Electron Microscope (SEM). Fiber and glass fragment analysis can also be done using nanotechnology.

Similarly, in forensic toxicology, nanomaterials have improved the detection and quantification of toxic substances, especially in biological samples. Nano Lab-on-Chip systems are being used to analyze blood, urine, and tissue samples, providing faster, more accurate toxicological analysis at lower cost than High-Performance Liquid Chromatography (HPLC).

Hifz: Do you have any final thoughts or a message for young researchers aspiring to excel in this field?

Dr. Shahid: I think your batch is the first undergraduate batch in Pakistan. Among youngsters like you and your fellows, there is great enthusiasm for joining forensic science.

Forensic science is a multidisciplinary field; the undergraduate curriculum draws courses from the physical, biological, and chemical sciences. I recommend it to anyone with an interest in multidisciplinary work who can stay curious and resilient. Every piece of evidence tells a story, like a jigsaw puzzle, and as forensic scientists we work hard to solve the puzzle, complete the story, and reach the suspect.

Forensic science is all about ethics; this is not an ordinary science where a mistake merely means a simple report change. Here, any mistake, corruption, or unethical practice can lead to someone’s death. We must practice truth and uphold ethical standards in forensic science.

Also Read: The Fundamentals and Applications of the Blockchain World with Boone Bergsma