Tuesday 13 December 2011

Higgs Signal Seen at CERN

CERN researchers speaking at the seminar (happening now - watch at http://webcast.web.cern.ch/webcast/) have reported on a signal recorded by the ATLAS experiment that is indicative of the Higgs boson.

The results appear to show a Higgs particle decaying into two photons.

The signal is far bigger than the calculated background (i.e. it is very unlikely to be a purely statistical fluctuation). This is the clearest evidence of a Higgs particle so far.

The signal has been detected at an energy of 126 GeV - a plausible value for the Higgs mass. Further results from ATLAS (currently being discussed) show a signal at a similar energy for the decay of the Higgs boson into leptons. This second piece of evidence backs up the significance of the strong 126 GeV two-photon-decay signal.
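
As a rough illustration of why a second, independent decay channel matters (this is not the collaborations' actual statistical procedure), here is a minimal Python sketch that combines the p-values of two channels using Fisher's method. The significances used are purely hypothetical.

from scipy import stats

# Rough sketch only - not the ATLAS statistical procedure.  Combine the
# p-values of two independent channels that both show an excess at the
# same mass, using Fisher's method.  The input significances are hypothetical.
p_diphoton = stats.norm.sf(2.8)   # hypothetical local significance of the two-photon channel
p_leptons  = stats.norm.sf(2.1)   # hypothetical local significance of the lepton channel

_, p_combined = stats.combine_pvalues([p_diphoton, p_leptons], method="fisher")
print(f"combined p-value ~ {p_combined:.1e} (about {stats.norm.isf(p_combined):.1f} sigma)")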

Stay tuned to find out how conclusive the findings are. The big question is whether the CMS experiment has observed the same signals.

What is the Higgs?


A simple but charming explanation of the basic principle of the Higgs theory.

LHC Higgs Results Seminar 13/12/11

Researchers at the Large Hadron Collider at CERN will hold a special seminar today, at which they are expected to discuss new evidence regarding the existence of the Higgs Boson.

Will this be the confirmation the scientific community has been waiting for since Peter Higgs first came up with the iconic Higgs theory to explain how particles get their mass? Or just another false alarm?

Watch this space for further news.

Tuesday 13 September 2011

New (Earth-like?) Planet Found

Vela Constellation
Credit: IAU and Sky & Telescope magazine
A new exoplanet, labelled HD85512b, has been found in the "Goldilocks Zone" around a star in the Vela constellation.

The "Goldilocks Zone" refers to the region around a star whose distance from the star means that the temperature is in the correct range for liquid water to be present. Our own planet, of course, inhabits this zone. If we are hoping to find another Earth-like planet out there, the "Goldilocks Zone" is the place to look.

The new exoplanet, HD85512b, is thought to be slightly warmer than Earth - around 30 to 50 degrees Celsius - and very humid. For it to retain liquid water on its surface, and therefore be habitable, it would need to have at least 50% cloud cover. It is not yet known whether this is the case. However, out of the 562 exoplanets discovered so far, HD85512b appears to stand the best chance of being habitable.

arXiv: 1108.3561

Read more about the search for exoplanets

Wednesday 31 August 2011

Controlling the Weather with Laser Beams

Researchers at the University of Geneva have developed a technique for making rain using a laser beam. Field experiments have been carried out close to Lake Geneva, with encouraging results.

So far, the technology has succeeded in creating tiny droplets of water, less than 0.01 millimetres in diameter. For the droplets to be heavy enough to fall as rain, they would have to be at least one hundred times bigger. However, it may be possible to induce these tiny droplets in moist air that is headed towards a natural feature such as a mountain range. As the air is forced to rise over the mountain range, it will cool, causing the droplets to grow larger until they are eventually heavy enough to fall as rain.
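
To see why such tiny droplets stay suspended, here is a rough sketch using Stokes' law for the settling speed of a small sphere in air. Stokes' law only holds for very small droplets, so the figure for the hundred-times-larger drop should be read as a crude upper bound rather than a realistic raindrop speed.

g      = 9.81       # gravitational acceleration, m/s^2
rho_w  = 1000.0     # density of water, kg/m^3
rho_a  = 1.2        # density of air, kg/m^3
mu_air = 1.8e-5     # dynamic viscosity of air, Pa*s

def stokes_settling_speed(radius_m):
    # terminal velocity of a small sphere in still air (Stokes regime only)
    return 2.0 * radius_m**2 * g * (rho_w - rho_a) / (9.0 * mu_air)

for radius in (5e-6, 5e-4):   # a 0.01 mm diameter droplet vs. one 100 times bigger
    print(f"radius {radius * 1e6:5.0f} micrometres -> settling speed ~ {stokes_settling_speed(radius):.4f} m/s")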

So how does it work? The laser beam causes particles of nitric acid to form in the cloud. Nitric acid forms bonds with water vapour in the air, acting as a means of bringing water molecules together so that condensation into liquid water takes place. The nitric acid-bonded water droplets have increased stability, so are less likely to re-evaporate than naturally formed droplets.

To read more about the significance of the technology, and the old silver iodide method that it could potentially replace, follow this link:

Making Rain With Laser Beams

*Image courtesy of Flickr user openuser

Wednesday 10 August 2011

Opportunity Mars Rover reaches Endeavour Crater

After 3 years of slowly crawling across the bleak Martian surface, the NASA Mars rover Opportunity has reached the rim of the Endeavour crater.

West rim of the Endeavour crater photographed by the Opportunity rover.
Courtesy NASA/JPL-Caltech.

Opportunity has already visited Victoria Crater and studied the layers of rock that lie within. Endeavour is 25 times wider and much deeper than Victoria, which means that Opportunity will be able to study older Martian rocks during its time here. Scientists hope that studying rocks that were formed during an earlier era of Mars' history will answer questions about the presence of water on the Red Planet.

Just last week, strong evidence emerged for the existence of flowing water on Mars. The aim of the Martian rover's rock analysis is to determine whether liquid water has ever existed for long periods of time on the surface of Mars.

NASA press release

Friday 5 August 2011

NASA's Jupiter-Bound "Juno" Mission Launches Successfully

Artist's Impression of Juno circling Jupiter.
Courtesy NASA/JPL-Caltech.
Just a few minutes ago, NASA successfully launched the unmanned spacecraft Juno into space. The solar-powered spacecraft will spend the next five years travelling to its destination, the gas giant Jupiter - the largest planet in our solar system.

Juno's aim is to study Jupiter's atmosphere and its gravitational and magnetic fields. The hope is that this data will give clues to Jupiter's origins. For example, if Juno detects a high concentration of water in Jupiter's atmosphere, that would suggest that Jupiter might have formed further out in the Solar System and gradually drifted to its current position.

Juno's year-long study will end in dramatic fashion, with the probe plunging into the depths of Jupiter's atmosphere and being destroyed.

See the Juno mission page from NASA: http://www.nasa.gov/mission_pages/juno/main/index.html

The Code: Mathematics in Nature

The BBC seems to have an affinity for mathematics documentaries at the moment. As well as re-airing The Story of Maths, they are also currently running a three-part documentary, The Code.

The programme highlights examples of mathematics popping up in unlikely places in nature. For example, one particular species of cicada emerges only once every 13 years, in order to minimise its chances of coinciding with another species. This works because 13 is prime and therefore shares few factors (only 1 and 13) with other numbers.
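
Here is a small sketch of the arithmetic behind the cicada example: two species with cycle lengths p and q emerge in the same year every lcm(p, q) years, and a prime cycle keeps that number large.

from math import gcd

def years_between_coincidences(p, q):
    # two cycles of length p and q line up every lcm(p, q) years
    return p * q // gcd(p, q)

for other in (2, 3, 4, 5, 6):
    print(f"13-year cicada vs a {other}-year species: coincide every "
          f"{years_between_coincidences(13, other):2d} years "
          f"(a 12-year cycle would coincide every {years_between_coincidences(12, other):2d} years)")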

There's also a "treasure hunt" based on clues from the TV show, the online games and the blog: Treasure Hunt

Catch the final episode on Tuesday 10th at 9pm, or view the whole series on BBC iPlayer.

Monday 1 August 2011

How to Bring Your Rocket Up to Speed

I just started writing for Brighthub, a site which provides information on science, technology and education.  I write in the Space Channel.

Check out my first article:  How is Calculus Used in Astronomy?

Sunday 24 July 2011

Space Shuttle Retirements: The End of an Era

The last of NASA's space shuttles, Atlantis, has gone into retirement. The shuttle touched down on July 21st at the Kennedy Space Center, delivering its four crew members safely back to Earth.

The space shuttle program has carried astronauts, satellites and supplies into orbit since 1981, and in more recent years has delivered crews, supplies and maintenance to the International Space Station. The space shuttle Discovery launched the Hubble Space Telescope and the shuttles have since made several missions to repair and install new instruments in the telescope.

The fleet of shuttles that operated in the program – Enterprise, Columbia, Challenger, Discovery, Atlantis and Endeavour – were revolutionary in that they were reusable. Earlier spacecraft such as Apollo, which carried Neil Armstrong, Buzz Aldrin and Michael Collins to the Moon in 1969, returned to Earth by splashing down in the ocean, but the winged space shuttles could glide majestically down to land on a runway.

Atlantis lands for the final time.  Source: NASA

The decision to retire the fleet of shuttles was made in 2004, following the deaths of the crew on board the space shuttle Columbia, which broke apart on re-entry into the atmosphere in 2003. A previous accident, in which the Challenger shuttle broke apart just 73 seconds after take-off in 1986, had already claimed the lives of 7 NASA astronauts.

The average cost of launching a space shuttle into orbit is around $450 million. In 2005, NASA spent almost 30% of its total budget on the space shuttle program. The decision to close down the program will mean job losses for around 3,000 NASA employees.

Until plans for a replacement for the space shuttle program are drawn up, astronauts will be transported to and from the International Space Station by Russian spacecraft. There are reports that a private company - either Orbital Sciences, Lockheed Martin or Boeing - will step in to fill the gap of providing a reusable spacecraft to replace the retired shuttles.

For more information, don't miss this program airing tonight at 9 pm on BBC2.

Saturday 23 July 2011

Breaking News: First Hints of Higgs Boson?

On 22 July, results indicating the presence of a Higgs boson with a mass in the range 120-140 GeV were reported at the EPS-HEP11 conference in Grenoble, by two teams of researchers working independently at the LHC.

The first results to be reported were from the ATLAS experiment, followed by (weaker) results from CMS.  Matt Strassler's blog appears to have been one of the first places to break the news following the conference proceedings.  The ATLAS team state the significance of their result as being 2.8 sigma, although this is before the "look elsewhere" effect is taken into account.  Once that is included, the probability of the peak in ATLAS's data being due to random statistical fluctuation rises to around 8%.
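
A toy Monte Carlo (not the collaborations' calculation) shows where the "look elsewhere" effect comes from: when a search scans many mass bins, the chance that some bin fluctuates up to 2.8 sigma purely by chance is much larger than the single-bin probability. The number of bins below is an assumption chosen only for illustration.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_bins, n_experiments, threshold = 30, 100_000, 2.8   # numbers assumed purely for illustration

# background-only pseudo-experiments: each mass bin is an independent unit Gaussian
fluctuations = rng.standard_normal((n_experiments, n_bins))
global_p = np.mean(fluctuations.max(axis=1) >= threshold)

print(f"local p-value of a single 2.8-sigma bin     : {norm.sf(threshold):.4f}")
print(f"chance that any of {n_bins} bins reaches 2.8 sigma: {global_p:.3f}")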

For a discussion of uncertainties and significance in the results, see:
How Certain is Certain?

The CMS experiment also sees a signal in the 120-145 GeV range, albeit a smaller one than ATLAS.

Click here for more information about how the Higgs search is carried out at LHC

The Higgs Boson (also known as the "God Particle") was proposed by Peter Higgs in 1964 as part of the Higgs mechanism, which attempts to explain why particles have mass.  Its discovery is one of the primary goals of the Large Hadron Collider, the 7.5 billion euro particle collider that lies in a 27 km circular tunnel underground near Geneva.


These results are exciting because they both show evidence of a Higgs boson in the expected mass range.  However, more data is required before we can say for certain that we have seen the Higgs particle.

---

Update:
CMS results available here: http://cms.web.cern.ch/cms/News/2011/EPS_2011/index.html (technical)

Saturday 16 July 2011

BBC series: The Story of Maths

The BBC is re-airing its 2008 series, "The Story of Maths", presented by Marcus du Sautoy.

Catch the first episode "The Language of the Universe" on iPlayer here (UK only)
The second episode, "The Genius of the East" will air on BBC 4 on Tuesday 19th July at 8pm.

The first hour-long documentary traces the origins of mathematics, detailing the contributions made by the Egyptians, Babylonians and Greeks.  It tells the story of how numbers such as pi, zero, irrational numbers and the Golden Ratio came to play such an important role in mathematics.  Episode Two will move east to China, and the final two episodes will trace the story of maths up to the present day.

I often feel patronised by science documentaries, but I'm actually learning things from this series, which seems like it would also be very accessible to non-mathematicians.  Even though I'm already familiar with the theorems, learning about their origins and how ancient people came to realise them gives fresh food for thought.  One of the things I love most about maths is how every problem can be approached using a range of different methods, so it's interesting to see how ancient cultures arrived at the same mathematical truths as modern mathematicians, often while seeking solutions to very different problems.

The Story of Math on DVD


Tuesday 5 July 2011

Tevatron finds CP-violation at 3.9 sigma: could this be the mechanism behind the matter-antimatter asymmetry?

Matter-antimatter asymmetry, as I wrote in this blog post and in this introductory article, is one of the great unsolved mysteries of the Universe. The problem in simple terms is that there is an excess of matter over antimatter in the Universe, and scientists and astronomers are baffled as to why. One would naively expect the Universe to behave symmetrically, treating matter and antimatter exactly the same rather than preferring one to the other. After all, symmetry and conservation laws play a huge part in physics.

However, physicists have increasingly found that this is not the case: some processes in particle physics treat matter and antimatter differently, which is a necessary ingredient for baryogenesis - the generation of the Universe's excess of matter. These processes violate CP symmetry: a combination of charge conjugation symmetry and parity symmetry.

Click here for an introduction to CP violation as a mechanism for baryogenesis.

Following the discovery of interactions that violate CP symmetry in 1964, the Standard Model of particle physics was modified to take account of CP violation, by adding a complex phase into the CKM matrix describing quark mixing. However, the maximum amount of CP violation that can be included in the Standard Model in this way is still far too small to account for the observed preponderance of matter over antimatter in the Universe. To put it in perspective, even if the parameters of the Standard Model were adjusted so as to give the maximum possible amount of CP violation, it would still only account for an excess of matter roughly equal in size to one galaxy - and there are hundreds of billions of galaxies in the observable Universe!
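
For a sense of just how small Standard Model CP violation is, a commonly used yardstick is the Jarlskog invariant of the CKM matrix. The sketch below evaluates its leading-order expression in the Wolfenstein parametrisation, using approximate parameter values.

# J ~ A^2 * lambda^6 * eta-bar in the Wolfenstein parametrisation of the CKM matrix.
# Parameter values are approximate.
lam = 0.225   # Wolfenstein lambda (roughly the sine of the Cabibbo angle)
A   = 0.81    # Wolfenstein A
eta = 0.35    # Wolfenstein eta-bar, the CP-violating parameter

J = A**2 * lam**6 * eta
print(f"Jarlskog invariant J ~ {J:.1e}")   # about 3e-5: a tiny amount of CP violation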

Recent experiments provide evidence that the amount of CP violation observed in nature is greater than the amount that is allowed by the Standard Model. Researchers at the Tevatron* have measured the dimuon charge asymmetry – the number of muons compared to antimuons that are produced in a particular reaction – and found that more muons are produced than antimuons. The amount by which muon production exceeds antimuon production is larger than that predicted by the standard model.

The researchers have quoted the disagreement between their results and the standard model as 3.9 sigma – this means that if the Standard Model prediction is correct, there is a 0.005% probability of obtaining the result they did. When an experiment gives a result that the standard theory says should only occur 0.005% of the time, it becomes sensible to ask whether the standard theory might be wrong. However, with thousands of particle physics experiments currently taking place around the world, one would expect to see a few anomalous results occurring, even if there was nothing wrong with the underlying theory. For this reason, the convention in particle physics is to disbelieve the current theory only if the level of disagreement between the theory and the results is 5 sigma – i.e. if the theory predicts that the observed result will occur 0.00003% of the time.
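
The probabilities quoted above come from the one-sided tail of a Gaussian distribution, which is the usual particle-physics convention for turning a number of "sigma" into a probability:

from scipy.stats import norm

for sigma in (3.0, 3.9, 5.0):
    p = norm.sf(sigma)   # one-sided Gaussian tail probability
    print(f"{sigma:.1f} sigma -> p = {p:.1e} ({p * 100:.5f}% chance under the standard theory)")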

In conclusion, this experiment has provided strong evidence that there may be more CP-violation occurring in the Universe than current Standard Model particle physics can explain. This CP-violation could be the mechanism by which the early Universe produced more matter than antimatter. If these results can be confirmed at the desired 5 sigma level of accuracy, they provide further motivation to develop a theory of particle physics that goes beyond the Standard Model.

*http://www-d0.fnal.gov/Run2Physics/WWW/results/final/B/B11B/B11B.pdf

Saturday 25 June 2011

Clumpiness of Distant Universe calls Standard Model of Cosmology and General Relativity into Question

A map of the most distant parts of the Universe – those objects that are at least 4 billion light years away – has been constructed using data from the Sloan Digital Sky Survey. Researchers have calculated how smooth or clumpy the distribution of galaxies is at this scale, and have come to a surprising conclusion: the Universe is much clumpier than models predict.

Matter in the present-day Universe is obviously arranged into clumps - stars, galaxies and galaxy clusters. But one would expect that the Universe originally started off in a state of relative smoothness. Therefore, one would expect to observe smoothness in those parts of the Universe which are many billions of light years away, since we observe those distant parts in the state that they were in billions of years ago due to the time it has taken for the light from them to reach us.

The very early Universe is expected to have been smooth because the Big Bang theory says that the Universe at that time was very small, so that light signals and matter could be exchanged between regions without much time delay. Therefore variations in temperature and density would have been smoothed out by exchange of heat energy and matter.

During inflation, the Universe expanded rapidly, with distances between some points increasing faster than light could cross them. These regions therefore became isolated from each other, no longer able to exchange matter and energy. Any fluctuations in the density of the Universe at this time became locked in, fixed for all eternity.

Over time, small peaks in the matter density became more pronounced, as the denser regions exerted a gravitational pull to draw further matter towards them. This is how objects such as stars, galaxies, and huge galaxy clusters formed, leaving huge voids of empty space between them.

The problem is that standard models of cosmology predict that the distant regions that have recently been mapped should have a clumpiness that varies only by about 1% over a length scale of 2 billion light years. The observed variation is almost double that.

This means either that there is something wrong with the cosmological models used to make the predictions - possibly the assumptions about the distribution of dark energy, or the existence of dark energy at all - or that Einstein's theory of general relativity doesn't work on such large scales. Despite the success of general relativity at predicting the movements of planets, it could be that it is only an approximation to the true theory of gravity, in the same way that Newton's theory of gravity is only an approximation to general relativity.

There is of course a third possibility, which is that the unexpected result is actually due to systematic errors in the data, such as dust in our own galaxy blocking the view of more distant objects, or nearby stars being misidentified as distant galaxies. A further study, using data that the Dark Energy Survey will begin collecting in October 2011, will hopefully confirm or rule out this possibility.

http://prl.aps.org/abstract/PRL/v106/i24/e241301

Thursday 16 June 2011

Single photon experiment supports speed of light limit

Einstein's theory of relativity tells us that nothing can travel faster than the speed of light - 300,000,000 metres per second. Any particle travelling faster than this cosmic speed limit would violate causality, leading to effects happening before their causes, which creates paradoxes.

So it came as a surprise when it was discovered that in certain materials light pulses can travel faster than the limit of 300,000,000 metres per second. The finding, reported here in Nature, raised concerns about whether it meant that information could be transmitted at greater than light speed - which, if true, would lead to the disturbing conclusion that it could be possible to send messages into the past.

Fortunately, a group of researchers at Hong Kong University have carried out the experiment using single photons, and shown that even though the group velocity of the wave is greater than c, individual photons travel at the usual speed of light in vacuum. This means that no information is transmitted at faster than light speed and causality is preserved.
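
For the curious, here is a toy numerical illustration (a simple Lorentz-oscillator model with arbitrary parameters, not the materials used in these experiments) of how a group velocity can exceed c: near an absorption resonance the refractive index falls with frequency, so the group index n_g = n + w*dn/dw can drop below one, making v_g = c/n_g faster than c, even though no signal outruns light in vacuum.

import numpy as np

# toy Lorentz-oscillator model of a narrow absorption line; all parameters are
# arbitrary (natural units), chosen only to show the effect
w0, gamma, wp2 = 1.0, 0.05, 0.01
w = np.linspace(0.8, 1.2, 2001)

# real part of the refractive index of a dilute medium near the resonance
n = 1.0 + 0.5 * wp2 * (w0**2 - w**2) / ((w0**2 - w**2)**2 + (gamma * w)**2)

# group index n_g = n + w * dn/dw; the group velocity is v_g = c / n_g
n_g = n + w * np.gradient(n, w)
print(f"minimum group index in the scan: {n_g.min():.2f}")
# values below 1 (here it even goes negative) correspond to a group velocity
# faster than c - but no usable information travels faster than light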

Thursday 26 May 2011

Electron's spherical shape puts squeeze on supersymmetry

Researchers at Imperial College London have measured the electric field surrounding an electron, and determined that it is almost perfectly spherical. The team placed an upper limit of one part in 10^18 on the deviation from a perfect sphere. Their result is consistent with zero asymmetry.

The measurement was made using dipolar molecules of ytterbium fluoride placed in electric and magnetic fields. The rate at which the molecules' spins precess is measured as the fields are varied; a change in this rate would indicate a slightly oval-shaped electron. The full findings were reported in Nature (doi:10.1038/nature10104).

The standard model predicts an electron which is very slightly egg-shaped: to one part in 10^28. Supersymmetry predicts a slightly larger deviation: between one part in 10^14 and one part in 10^19. A factor of ten improvement in the accuracy of the measurement could either confirm or disprove supersymmetry.

"We cannot rule out supersymmetry but we're certainly putting pressure on the theory," says John Hudson from the Imperial team.


The search for supersymmetry, a theory proposed as a solution to the hierarchy problem, is closing in, but there is still no evidence that it exists. With the LHC still reporting no sign of supersymmetry, could this elegant theory be nearing the end of the road?

Sunday 27 February 2011

Dark matter theory challenged by gassy galaxies result, LHC sees no SUSY (yet)

An old friend just sent me this article which discusses new results from gas-rich galaxies in favour of Modified Newtonian Dynamics theory (MOND).
BBC News - Dark matter theory challenged by gassy galaxies result

This coincides nicely with this article I was reading about how the LHC has so far failed to find supersymmetric particles ("sparticles") at the electroweak scale. These particles were thought to be the most likely candidates for dark matter.

Could this be the end for dark matter theory? Well, no. Certainly not at this stage. The author of the gassy galaxies study, Stacy McGaugh, admits that MOND still produces poor results on the scale of galactic clusters. And the LHC results are still a long way from ruling out supersymmetry. It could be that the particles have not been seen because their masses are beyond the range the ATLAS experiment has been looking at - although SUSY models do begin to get complicated if the sparticles involved are very heavy.

It will be interesting to see what results emerge from the LHC in 2011. The collider began to reawaken from its winter shutdown on Feb 19th, when the particle beams started circulating again. The number of collisions is to be stepped up this year, with 100 times more data expected to be collected in 2011 than in 2010.

Tuesday 22 February 2011

Cycles of Time: An Extraordinary New View of the Universe (Review)

When I came across Roger Penrose's new book, Cycles of Time: An Extraordinary New View of the Universe, I just had to read it, having twice heard him speak on the slightly wacky topic of Conformal Cyclic Cosmology (CCC).  CCC is the idea that the Universe may have undergone several cycles, together with the controversial proposal that information about the time before the Big Bang may still be present in the Cosmic Microwave Background today.  In both of the talks I heard him give - about a year apart - he ended with the tantalising promise that observational evidence testing CCC could be just around the corner.

Perhaps the best introduction I can give to this book is to link to a recording of Penrose speaking on the subject.  This talk is from 2005, several years earlier than the talks I heard him give in which he promised observational results.  Alas, I do not have recordings of either of those.

The book expands upon the ideas that Penrose speaks about and is able to present them at greater length, which makes it easier to follow than the talks.  Mathematical details are included, but mostly kept confined to detailed appendices, with references for the reader who desires to pursue the topic in depth.

Penrose's bold idea brings together several topics in physics: entropy, black holes, particle physics and cosmology, all of which are explored in the book.  He closes by explaining exactly how cosmological observations could provide a physical test for his theory.  As I spend my days in an environment in which people play with toy mathematical models or hammer out the details of string theory without too much concern for the real world, I find it refreshing to see a theorist giving serious thought to the testability of his theory.

Monday 21 February 2011

Theoretical Physics Seminars and Lectures Online

Over the weekend I came across some very helpful resources for keeping up with what's going on in the world of physics research, and for learning physics.

I love listening to physics talks and seminars, but hate the whole experience of going to the department and being crammed into a room with other researchers. Also, how many times have you lost concentration for 30 seconds whilst listening to a talk, only to tune back in to find that you've missed a vital piece of information or step in the reasoning, which makes the rest of the talk impossible to follow?  Wouldn't it be great to be able to pause or rewind the speaker?

There are a few institutions that helpfully publish their seminar series online:

This is not a comprehensive list of institutions, rather a list of those that I found which have significant amounts of up-to-date and interesting material.  I will be updating the list as I find more sources.

Please feel free to add more links in the comments!

Wednesday 9 February 2011

The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos (Book Review)

Brian Greene, bestselling author of The Elegant Universe, has recently released a new book, The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos, in which he sets out the arguments for the theory that the universe we inhabit may in fact be just one of many universes.

The concept of this multitude of universes, or “multiverse”, arises in response to many puzzling topics in physics. It has been invoked to explain the apparent fine-tuning of physical constants, as an interpretation of the uncertain nature of quantum mechanics, and as the “braneworld” scenario of string theory - in which our universe is a three-dimensional membrane floating in a higher-dimensional space of alternate universes.

Greene tackles each of these in turn, starting in Chapter Two with the simplest reasoning behind the belief in parallel worlds: if the universe is infinite in extent, it is inevitable that one of the infinite number of planets out there is identical to our own, and that on it there are people exactly like ourselves undergoing an alternate version of our reality. Greene also discusses whether the physical size of our universe is infinite or finite, which is still very much an open question in modern physics.

Greene, as in his previous books, uses metaphor to make difficult concepts accessible to the general reader. Expect to have to take your time to wrap your brain around the subject matter, but what Greene won't do is blind you with technical jargon or assume specialist knowledge.

I have been a fan of Brian Greene since I read The Elegant Universe at seventeen. If I had to pinpoint a deciding factor in my decision to study theoretical physics, this book would probably be it. This was the book that got me hooked on strings, particles and the quest for a “theory of everything”. It was fantastic to read the author's recent follow-up and have those feelings of curiosity and excitement resparked.


Monday 7 February 2011

Dark Matter Due for Discovery?

The Large Hadron Collider (LHC) at CERN will be up and running again soon following its winter shutdown. As well as seeking the Higgs Boson, scientists are also hoping that the accelerator will provide insights into the nature of dark matter.

In my last post, Torsion and the Matter-Antimatter Asymmetry, I discussed a recent article which concluded that dark matter might consist of antiparticles left over from the Big Bang.  But this is only one of many theories.  What follows is an introduction to dark matter and a brief summary of the theories that have been proposed to answer the question: what is dark matter?

Introduction to Dark Matter

The concept of “dark matter” was introduced in 1933 by Fritz Zwicky. By measuring the motions of galaxies in the Coma cluster, he calculated that the total mass of the cluster must be around 400 times greater than the visible mass he observed.
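
The kind of estimate involved is a virial-theorem argument: the spread of the galaxies' velocities tells you how much mass is needed to hold the cluster together gravitationally. A rough sketch with purely illustrative numbers (not Zwicky's actual data):

G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_sun   = 2.0e30      # solar mass, kg (approximate)
sigma_v = 1.0e6       # velocity dispersion of the cluster's galaxies, m/s (~1000 km/s, illustrative)
R       = 3.1e22      # cluster radius, m (~1 megaparsec, illustrative)

# virial-style estimate; the prefactor (here 5) depends on the assumed mass profile
M_dyn = 5.0 * sigma_v**2 * R / G
print(f"dynamical mass ~ {M_dyn / M_sun:.1e} solar masses")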

The problem was taken up by astronomers Vera Rubin and Kent Ford at the Carnegie Institution during the 1960s and 70s. They measured the rotation curves of spiral galaxies and produced results that also seemed to imply the presence of a large amount of non-visible mass.

The Evidence for Dark Matter

Astronomical observations clearly show that most of the stars in a spiral galaxy lie in a flat, rotating disc, with the density of stars being much greater towards the centre than further out. The stars in the galaxy are held together by the gravitational attraction between them.

Newton's law of gravitation: F = G m1 m2 / r^2, where F is the gravitational force, m1 and m2 are the masses of the two bodies, r is the distance between them, and G is the gravitational constant.

As stated in Newton's famous gravitational law (shown above), the gravitational attraction between two bodies decreases as the distance between them increases. This means that, if most of the mass of the galaxy is located in its centre, then a star orbiting at the edge of the galaxy should experience a weaker gravitational pull than a star that is close to the dense galactic core.

The speed at which a star orbits depends on how much gravitational force is exerted on it by the rest of the galaxy: a star orbiting too slowly will gradually spiral in towards the centre, and a star that orbits too fast will spiral outwards as the gravitational force is not strong enough to keep it reined in. Therefore, we would expect stars that are further out to have lower velocities than those close to the centre. This expectation is plotted on the graph below as curve A. However, Rubin and Ford's results (curve B) show that the velocity of the stars is about the same for all distances from the centre.
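
A toy numerical version of this comparison, with purely illustrative numbers: for a circular orbit, gravity supplies the centripetal force, so v(r) = sqrt(G M(<r) / r). If all the mass sat at the centre, the orbital speed would fall off as 1/sqrt(r) (curve A), whereas the roughly constant speeds actually observed (curve B) require the enclosed mass M(<r) to keep growing with radius.

import numpy as np

G      = 6.674e-11                  # gravitational constant, m^3 kg^-1 s^-2
M_sun  = 2.0e30                     # solar mass, kg (approximate)
M_core = 1.0e41                     # mass concentrated at the centre, kg (~5e10 Suns, illustrative)
r      = np.linspace(5e19, 5e20, 5) # galactocentric radii, m (illustrative)

v_keplerian = np.sqrt(G * M_core / r)   # curve A: all mass at the centre, v falls as 1/sqrt(r)
v_flat      = 2.2e5                     # curve B: observed roughly constant speed, m/s (illustrative)
M_needed    = v_flat**2 * r / G         # enclosed mass a flat curve would require; grows linearly with r

for ri, vk, Mn in zip(r, v_keplerian, M_needed):
    print(f"r = {ri:.1e} m : curve A speed = {vk / 1e3:5.1f} km/s, "
          f"flat curve needs M(<r) = {Mn / M_sun:.1e} solar masses")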


There are two possible explanations for this. The first is that the theory of Newtonian gravity, despite its success in predicting the orbits of planets in our own solar system, is flawed. It has been suggested that Newton's law of gravity should be modified to better fit the galaxy rotation curves. This hypothesis is known as MOND – Modified Newtonian Dynamics. Although MOND is very successful in predicting galaxy rotation curves (as one would expect, since this is the situation it was designed to model), it is less good at predicting effects at the larger scale of galactic clusters.

The more popular explanation is dark matter theory: the idea that there is more mass in galaxies than we can see through our telescopes. The problem is that no-one knows what this dark matter actually is. It was first thought that it might consist of non-luminous objects such as black holes, brown dwarfs and neutron stars, known collectively as MACHOs – Massive Compact Halo Objects. However, it is now thought that the majority of dark matter is in the form of WIMPs – Weakly Interacting Massive Particles.

But what are WIMPs?

Particle physics has suggested many candidates for WIMPs. Any particle that makes up dark matter would have to be stable – i.e. it must not decay into other particles. Neutrinos, which are stable and only very rarely interact with other particles, have been suggested as one possible candidate.

Another possibility is the theoretical Lightest Supersymmetric Particle (LSP). Supersymmetry is a theory of physics which predicts the existence of a much heavier partner to each known particle. The lightest of these super-particles (nicknamed “sparticles”) is expected to be stable so would be a good candidate for dark matter.

Experiments are currently underway at the LHC to try to produce sparticles, and we could see results as soon as this year. Will dark matter finally reveal its identity?

The discovery of supersymmetry would have many exciting consequences for theoretical physics beyond the identification of dark matter... but that's the subject of a future post.

Friday 4 February 2011

Torsion and the Matter-Antimatter Asymmetry

Matter-antimatter asymmetry is a mystery which is (almost) as old as the universe: why is the amount of matter in the universe - the stuff that makes up you and me and everything familiar - so much greater than the amount of antimatter?

Many scientists have attempted to come up with an explanation for this disconcerting lack of symmetry.  I give a layman's overview of the problem and list the suggested solutions in this article.  The latest suggestion that I want to discuss here is that a variant of General Relativity involving torsion could be responsible for the apparent asymmetry.

Introduction to the problem
Antiparticles have the same mass and spin as their particle siblings, but opposite charge - a fact which led to their discovery in 1932, when Carl Anderson noticed a track in a cloud chamber that looked exactly like the path of an electron except for one detail: it curved anticlockwise instead of clockwise in the chamber's magnetic field, indicating that the particle responsible was positively rather than negatively charged.  The track had been left by a new kind of particle - a positive electron, or "positron" - which had been predicted to exist by Dirac four years before Anderson's discovery.

The reason that we had never seen one of these particles before is that positrons are rare: the term matter-antimatter asymmetry refers to the fact that the ratio of electrons to positrons has been observed to be staggeringly large.

The dominance of matter over antimatter was established during the first few seconds after the Big Bang.  During this time, the enormous density of the Universe meant that temperatures were very high and particles and antiparticles bubbled in a hugely energetic plasma, undergoing millions of interactions per second.

In an interaction, particles collide and are destroyed, with new particles being born from the energy liberated by the destruction.  But even this seemingly chaotic process follows rules: one such rule is that baryon number, B, defined as the number of baryons minus the number of antibaryons, is always conserved.  This means that if before the interaction you had two baryons and no antibaryons, then afterwards you must have either two baryons and no antibaryons, or three baryons and one antibaryon, or any other combination such that B = 2, just as it did before the interaction took place.
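
A tiny bit of bookkeeping code makes the rule concrete; the "reaction" below is the classic antiproton-production process p + p -> p + p + p + antiproton, written out purely as an illustrative list of particles.

def baryon_number(state):
    # B = (number of baryons) - (number of antibaryons)
    return sum(-1 if is_antibaryon else +1 for _, is_antibaryon in state)

before = [("proton", False), ("proton", False)]                     # B = 2
after  = [("proton", False), ("proton", False),
          ("proton", False), ("antiproton", True)]                  # B = 3 - 1 = 2

print("B before:", baryon_number(before), "| B after:", baryon_number(after))
print("allowed by baryon-number conservation:", baryon_number(before) == baryon_number(after))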

The problem currently faced is that the value of B in our universe today is observed to be a very large positive number, and there is no explanation as to why this should be the case.  In a universe displaying an astonishing degree of symmetry, it feels surprising and unnatural to have such a blatant asymmetry built into the universe's initial conditions.

Torsion as a possible solution
A recent paper by Nikodem J. Poplawski of Indiana University is the latest to suggest a solution - not only to the conundrum of matter-antimatter asymmetry, but also to the puzzling nature of dark matter and the origin of dark energy.

The approach is to modify Einstein's theory of General Relativity to include a non-zero torsion tensor.  The torsion tensor expresses the amount of 'twist' in the fabric of space-time.  This term interacts with the fields in Einstein's equations which represent fermions, causing their masses to change, and it acts unequally on particles and antiparticles, creating an asymmetry.  The result is that particles in the early universe decay into normal matter and antiparticles into dark matter.  The author therefore suggests that the overall baryon number of the Universe is in fact zero - there is exactly the same amount of matter as antimatter - but that the antimatter is present in the form of dark matter, which lurks in great clouds in galaxies, undetectable except by the gravitational effects of its mass.

The author also claims that torsion leads to the conclusion that the cosmological constant is non-zero.  This means that torsion provides an explanation for the presence of dark energy, the mysterious force which is causing the universe to expand at an ever-increasing rate.

The author makes no quantitative prediction of the ratio of matter to antimatter that the theory would yield, so we cannot yet compare the universe that would result from such a theory with the one we observe in order to test it.