Saturday, March 24, 2007

Alternative theory of gravity explains large-scale structure formation -- without dark matter


But the feature of Bekenstein’s theory that Dodelson and Liguori focus on most is that the theory—unlike standard general relativity—allows for fast growth of density perturbations arising from small inhomogeneities during recombination. Building on this finding, reported by Skordis et al. earlier this year, Dodelson and Liguori have identified which aspect of the theory actually causes the enhanced growth—the part that may solve the cosmological structure problem.

The pair has discovered that, while Bekenstein’s theory has three functions which characterize space-time—a tensor, vector and scalar (TeVeS)—it’s the perturbations in the vector field that are key to the enhanced growth. General relativity describes space-time with only a tensor (the metric), so it does not include these vector perturbations.

“The vector field solves only the enhanced growth problem,” said Dodelson. “It does so by exploiting a little-known fact about gravity. In our solar system or galaxy, when we attack the problem of gravity, we solve the equation for the Newtonian potential. Actually, there are two potentials that characterize gravity: the one usually called the Newtonian potential and the perturbation to the curvature of space. These two potentials are almost always very nearly equal to one another, so it is not usually necessary to distinguish them.

“In the case of TeVeS, the vector field sources the difference between the two,” he continued. “As it begins to grow, the difference between the two potentials grows as well. This is ultimately what drives the overdense regions to accrete more matter than in standard general relativity. The quite remarkable thing about this growth is that Bekenstein introduced the vector field for his own completely independent reasons. As he remarked to me, ‘Sometimes theories are smarter than their creators.’"
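In the standard notation of cosmological perturbation theory (our gloss; the paper's own conventions may differ), the two potentials Dodelson describes appear in the perturbed metric as

\[ ds^2 = -(1 + 2\Psi)\,dt^2 + a^2(t)\,(1 - 2\Phi)\,\delta_{ij}\,dx^i\,dx^j, \]

where \(\Psi\) plays the role of the Newtonian potential and \(\Phi\) perturbs the spatial curvature. In general relativity with negligible anisotropic stress, \(\Phi \simeq \Psi\); in TeVeS the vector perturbation sources a nonzero difference \(\Phi - \Psi\), and it is the growth of that difference that drives the enhanced accretion described above.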

Dodelson and Liguori see this solution to large-scale structure formation as an important step for a gravity theory based on baryon-only matter. Other problems that their theory (or any alternative theory) will have to confront include accounting for the mismatch in galaxy clusters between mass and light. Also, the theory must conform to at least two observations: the galaxy power spectrum on large scales, and the cosmic microwave background fluctuations, which trace the primordial seeds of today's galaxies and galaxy clusters.

“As Scott says, until dark matter is observed, skeptics will be allowed,” said Liguori. “Despite the many and impressive successes of the dark matter paradigm, which make it very likely to be correct, we still don't have any final and definitive answer. In light of this, it is important to keep an eye open for possible alternative explanations. Even when, after the analysis, alternative theories turn out to be wrong, the result is still important, as it strengthens the evidence for dark matter as the only possible explanation of observations.”

Citation: Dodelson, Scott and Liguori, Michele. “Can Cosmic Structure Form without Dark Matter?” Physical Review Letters 97, 231301 (2006).

By Lisa Zyga, Copyright 2006 PhysOrg.com

Wednesday, March 21, 2007

Loop quantum gravity


Loop quantum gravity (LQG), also known as loop gravity and quantum geometry, is a proposed quantum theory of spacetime which attempts to reconcile the seemingly incompatible theories of quantum mechanics and general relativity. This theory is one of a family of theories called canonical quantum gravity. It was developed in parallel with loop quantization, a rigorous framework for nonperturbative quantization of diffeomorphism-invariant gauge theory. In plain English, this is a quantum theory of gravity in which the very space that all other physics occurs in is quantized.

Loop quantum gravity (LQG) is a proposed theory of spacetime which is constructed with the idea of spacetime quantization via the mathematically rigorous theory of loop quantization. It preserves many of the important features of general relativity, while at the same time employing quantization of both space and time at the Planck scale in the tradition of quantum mechanics.

LQG is not the only theory of quantum gravity. The critics of this theory say that LQG is a theory of gravity and nothing more, though some LQG theorists have tried to show that the theory can describe matter as well. There are other theories of quantum gravity, and a list of them can be found on the quantum gravity page.

Many string theorists believe that it is impossible to quantize gravity in 3+1 dimensions without creating matter and energy artifacts. This is not proven, and it is also unproven that the matter artifacts, predicted by string theory, are not exactly the same as observed matter. Should LQG succeed as a quantum theory of gravity, the known matter fields would have to be incorporated into the theory a posteriori. Lee Smolin, one of the fathers of LQG, has explored the possibility that string theory and LQG are two different approximations to the same ultimate theory.

The main claimed successes of loop quantum gravity are:

1. It is a nonperturbative quantization of 3-space geometry, with quantized area and volume operators (see the area spectrum sketched after this list).
2. It includes a calculation of the entropy of black holes.
3. It is a viable gravity-only alternative to string theory.
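To give a flavour of point 1, the area operator in LQG has a discrete spectrum. In the usual presentation (a textbook result, quoted here as a gloss rather than taken from this article), a surface punctured by spin-network edges carrying half-integer spins \(j_i\) has area

\[ A = 8\pi\gamma\,\ell_{\rm P}^2 \sum_i \sqrt{j_i(j_i + 1)}, \]

where \(\ell_{\rm P}\) is the Planck length and \(\gamma\) is the Barbero-Immirzi parameter, so areas come in discrete quanta of order \(\ell_{\rm P}^2 \approx 10^{-70}\,\mathrm{m}^2\).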

However, these claims are not universally accepted. While many of the core results are rigorous mathematical physics, their physical interpretations remain speculative. LQG may yet prove viable as a refinement of either gravity or geometry. For example, the entropy calculated in point 2 is for a kind of hole which may, or may not, be a black hole.

Some alternative approaches to quantum gravity, such as spin foam models, are closely related to loop quantum gravity.

MORE AT <http://en.wikipedia.org/wiki/Loop_quantum_gravity>

Tuesday, March 20, 2007

Computers That Run On Light?


An £820,000 research project begins soon that could be an important step in bringing the dream of photonic computers – devices that run on light rather than electronics – onto the desktop.

Physicists at the University of Bath will be looking at developing attosecond technology – the ability to send out light in a continuous series of pulses that last only an attosecond, one billion-billionth (10⁻¹⁸) of a second.

The research could not only develop the important technology of photonics, but could give physicists the chance to look at the world of atomic structure very closely for the first time.

In June Dr Fetah Benabid, of the Department of Physics at Bath, will lead a team of researchers to develop a new technique which would enable them to synthesise ‘waveforms’ using light photons with the same accuracy with which electrons are manipulated in electronics. Waveform synthesis is the ability to control very precisely the way that electric fields vary their energy.

Ordinarily, electric fields rise and fall in energy in a regular pattern similar to the troughs and crests of waves on the ocean, but modern electronics allows a close control over the shape of the ‘wave’ – in effect creating waves that are square or triangular or other shapes rather than curved.
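As a concrete illustration of waveform synthesis, here is a minimal sketch in Python (our addition: the harmonic-summing recipe is standard Fourier analysis, not a description of the Bath group's apparatus) that builds an approximately square wave out of pure sine waves:

    import numpy as np

    def square_wave(t, n_harmonics=7, freq=1.0):
        """Approximate a square wave by summing odd harmonics of a sine.

        Fourier series: square(t) ~ (4/pi) * sum over odd k of
        sin(2*pi*k*freq*t) / k. More harmonics give sharper edges.
        """
        wave = np.zeros_like(t)
        for k in range(1, 2 * n_harmonics, 2):  # k = 1, 3, 5, ...
            wave += np.sin(2 * np.pi * k * freq * t) / k
        return (4 / np.pi) * wave

    t = np.linspace(0.0, 2.0, 1000)  # two periods at freq = 1
    approx = square_wave(t)          # oscillates between roughly +1 and -1

The same summing of harmonics, carried out on optical rather than electronic waves, is what the researchers hope to achieve with light.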

It is this control of the variation of the electric field that allows electronic devices such as computers to function in the precise way needed.

But electronics has its limitations, and the development of ever smaller silicon chips which has allowed computers to double in memory size every 18 months or so will come to an end in the next few years because the laws of physics do not permit chips smaller than a certain size.

Instead, engineers are looking to the science of photonics, which uses light to convey information, as a much more powerful alternative. But so far photonics can use light whose waveform is in one shape only – a curve known as a sine wave – and this has limited value for the communications needed to run a computer, for example.

The Bath researchers want to allow photonics to create waveforms in a variety of different patterns. To do this, they are using the new photonic crystal fibres which are a great step forward in photonics because, unlike conventional optical fibres, they can channel light without losing much of its energy.

In the research, light of one wavelength will be passed down a photonic crystal fibre which then branches off into a tree-like arrangement of fibres, each carrying a slightly different wavelength, creating a broad ‘comb-like’ spectrum of light from the ultra-violet to the middle of the infra-red range.

This broad spectrum would allow close control over the electric field, which is the basis of conveying the enormous amounts of information that modern devices like computers need. The researchers are funded by a grant from the Engineering and Physical Sciences Research Council.
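A toy calculation (ours, with arbitrary illustrative numbers) shows why a broad comb of equally spaced frequencies is the raw material for ultrashort pulses: adding many comb lines in phase produces a train of pulses whose duration shrinks as the spectrum broadens.

    import numpy as np

    # N phase-locked comb lines spaced by df add up to a pulse train;
    # the individual pulses narrow roughly as 1/(N * df), so a broader
    # spectrum means shorter pulses. All numbers are illustrative.
    N, f0, df = 50, 100.0, 1.0
    t = np.linspace(0.0, 2.0, 20000)
    field = sum(np.cos(2 * np.pi * (f0 + n * df) * t) for n in range(N))
    intensity = field ** 2  # sharp peaks repeating every 1/df

With 50 lines the peaks are roughly 50 times narrower than the spacing between them; scale the spacing up to optical frequencies and the same arithmetic points toward attosecond durations.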

“Harnessing optical waves would represent a huge step, perhaps the definitive one, in establishing the photonics era,” said Dr Benabid.

“Since the development of the laser, a major goal in science and technology has been to emulate the breakthroughs of electronics by using optical waves. We feel this project could be a big step in this.

“If successful, the research will be the basis for a revolution in computer power as dramatic as that over the past 50 years."

Dr Benabid said that, if his research was successful, the resulting technology could, for instance, make lasers that operate at wavelengths current technology cannot create, which would be important for surgery.

The continual series of short bursts of light will not only dramatically affect technology - it will also advance physics by giving researchers the chance to look inside the atom.

Although atoms can now be “seen” using devices such as electron microscopes, it has not been possible to examine their fast dynamics.

By sending the light in short bursts into an atom, researchers will be able to work out the movements of electrons, the tiny negatively charged particles that orbit the atom’s nucleus.

This may throw light, literally, upon the strange quantum world of sub-atomic particles, which have no definite position, but are only ‘probably’ in one place until observed.

TAKEN FROM <http://www.sciencedaily.com/releases/2007/03/070319114521.htm>

Monday, March 19, 2007

Project Orion


Project Orion was an advanced rocket design explored in the 1960s. Orion is also the name of NASA's new spacecraft for human space exploration, previously known as the Crew Exploration Vehicle, which is designed to replace the Space Shuttle and eventually return to the Moon.

The 1960s Project Orion examined the feasibility of building a nuclear-pulse rocket powered by nuclear fission. It was carried out by physicist Theodore Taylor and others over a seven-year period, beginning in 1958, with United States Air Force support. The propulsion system advocated for the Orion spacecraft was based on an idea first put forward by Stanislaw Ulam and Cornelius Everett in a classified paper in 1955. Ulam and Everett suggested releasing atomic bombs behind a spacecraft, followed by disks made of solid propellant. The bombs would explode, vaporizing the material of the disks and converting it into hot plasma. As this plasma rushed out in all directions, some of it would catch up with the spacecraft, impinge upon a pusher plate, and so drive the vehicle forward.
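The momentum budget behind this scheme can be sketched with a toy calculation in Python (every number below is an assumption chosen for illustration, not a Project Orion design figure):

    # Toy momentum-transfer estimate for a nuclear-pulse rocket.
    # All figures are illustrative assumptions; mass lost as the
    # propellant disks are expended is ignored for simplicity.
    ship_mass = 4.0e6      # kg, assumed vehicle mass
    plasma_mass = 1.0e3    # kg, propellant disk vaporized per bomb
    plasma_speed = 3.0e4   # m/s, assumed mean plasma speed at the plate
    caught = 0.1           # fraction of plasma momentum hitting the plate

    dv_per_pulse = caught * plasma_mass * plasma_speed / ship_mass
    n_pulses = 1000
    print(f"delta-v per pulse: {dv_per_pulse:.2f} m/s")
    print(f"after {n_pulses} pulses: {n_pulses * dv_per_pulse / 1e3:.2f} km/s")

Under these assumed numbers each bomb adds under a metre per second, so thousands of pulses would be needed for useful speeds, which is why the design called for a long, steady drumbeat of detonations.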

Overshadowed by the Moon race, Orion was forgotten by almost everybody except Taylor and the physicist Freeman Dyson, who had worked on the project. Dyson reflected that "this is the first time in modern history that a major expansion of human technology has been suppressed for political reasons." In 1968 he wrote a paper about nuclear pulse drives and even large starships that might be propelled in this way. But ultimately, the radiation hazard associated with the early ground-launch idea led him to become disillusioned with the concept. Even so, he argued that the most extensive flight program envisaged by Taylor and himself would have added no more than 1% to the atmospheric contamination then (c. 1960) being created by the weapons-testing of the major powers.

Being based on fission fuel, the Orion concept is inherently "dirty" and probably no longer socially acceptable even if used only well away from planetary environments. A much better basis for a nuclear-pulse rocket is nuclear fusion – a possibility first explored in detail by the British Interplanetary Society in the Daedalus project.

TAKEN FROM <http://www.daviddarling.info/encyclopedia/O/OrionProj.html>

Sunday, March 18, 2007

Light goes backwards in time


Physicists today claim to have made light travel at 300 times its normal speed. But don't write off Einstein, and don't hold your breath for a time-travelling Star Trek universe, warns Paul Davies.

On the face of it, today's announcement in Nature that a team of Princeton physicists have broken the light barrier demolishes what is arguably science's most cherished principle.

Ever since Albert Einstein formulated his theory of relativity nearly a century ago, it has been a central tenet of physics that nothing can travel faster than light. Now it is claimed that in certain circumstances, light itself can be accelerated up to 300 times its usual speed. But it's too soon to consign the textbooks to the dustbin. As always, the devil is in the detail.

Moving through a vacuum, light travels at 300,000 km per second. According to the theory of relativity, it is the ultimate speed limit for the propagation of any physical influence. That includes spacecraft, subatomic particles, radio signals, or anything that might convey information or cause an effect.

When light passes through a medium such as air, it is slowed. The effect is best explained by analogy with water waves. Try throwing a stone in a pond to make ripples. Focus on a particular wave crest, and it will appear to move fairly fast, but then take a wider perspective to view the group of waves as a whole, and it travels outwards from the point of disturbance noticeably more slowly. It is almost as if the waves are rushing to get nowhere fast. You can watch as new ripples rise up at the back of the wave group, whiz forwards, and fade away at the front.

The same thing happens to light in a medium. It comes about because atoms in the medium create outgoing ripples of light as the primary light wave sweeps by them. When these ripples overlap and combine with the primary wave, they obliterate the parts racing on ahead, suppressing the fast-moving wave front and serving to slow down the group. So light passing through a medium has two associated velocities: that of the group as a whole, and that of the individual wave crests, known as the phase velocity.

A normal medium always reduces the group velocity of light to below its phase velocity, leading to the familiar phenomenon of refraction - the effect that causes a stick to look bent when it is stuck in water. The special feature of the Princeton experiment was the creation of a peculiar state of matter in which this situation is reversed: the secondary ripples of light actually make the wave group travel faster than the phase velocity.

To achieve this odd state of affairs, the scientists used a gas of cold caesium, and then excited the caesium atoms with a laser. So energised, the atoms do more than cause secondary ripples of light: they amplify the light too. It is this amplification that is the key to boosting the speed of the wave group, reportedly to 300 times the speed of light in a vacuum. Bizarrely, the wave distortion achieved is so large that it causes the group velocity to become negative, which means the peak of the wave pulse appears to exit the gas before it enters. In other words, the light waves seem to run backwards.
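In standard optics notation (our gloss, not the researchers' own presentation), the two velocities and the origin of the backwards-running pulse look like this:

\[ v_{\rm phase} = \frac{\omega}{k} = \frac{c}{n(\omega)}, \qquad v_{\rm group} = \frac{d\omega}{dk} = \frac{c}{n_g}, \qquad n_g = n + \omega\,\frac{dn}{d\omega}. \]

In an ordinary medium \(dn/d\omega > 0\), so \(n_g > n\) and the group lags the phase. Between closely spaced gain lines in the pumped caesium, \(dn/d\omega\) is large and negative, driving \(n_g\) below zero; a negative group index is precisely the regime in which the pulse peak appears to leave the cell before it arrives.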

What makes this result so sensational is the relationship between light speed and causality. The theory of relativity predicts that speed slows time. For example, time passes a bit slower in an aircraft than on the ground, an effect that has been verified using atomic clocks. The time warp is small for everyday motion, but grows without limit as the speed of light is approached. Cosmic rays, for example, travel exceedingly close to the speed of light, and their internal clocks are slowed millions of times.
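The size of the effect follows from the textbook time-dilation formula (added here as a gloss):

\[ \Delta t_{\rm moving} = \frac{\Delta t_{\rm rest}}{\gamma}, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}. \]

At everyday speeds \(\gamma\) barely differs from 1; at \(v = 0.999999\,c\) it is about 700; and since \(\gamma\) grows without bound as \(v \to c\), a sufficiently energetic cosmic-ray clock can indeed run millions of times slow.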

Relativity theory predicts that if a particle could exceed the speed of light, the time warp would become negative, and the particle could then travel backwards in time.

As Dr Who fans are aware, travel into the past opens up a nest of paradoxes. For example, suppose a faster-than-light particle is used as a signal to explode a bomb in the very lab in which the particle itself is created. If the bomb explodes yesterday, the particle cannot be made today. So the bomb won't explode, and the particle will be made.

Either way, you get contradictory nonsense. At stake, then, is the very rationality and causal order of the universe. Allow faster-than-light travel, and the physical world turns into a madhouse.

Timing the speed of a pulse of light is fraught with complications, not least because the shape of the pulse changes when it passes through a medium. To make a pulse of a short duration, it is necessary to mix together waves of many different frequencies, and in a medium each wave will propagate differently.

As for transmitting information, opinions differ about how to associate it with a pulse that has a complicated, changing shape. The inherent fuzziness in a light pulse made up of many different waves superimposed precludes a clean definition of how fast actual information travels.

The problem is closely related to the quantum nature of light, where each frequency component can be thought of as made up of photons that behave in some ways like particles. But photons are subject to Heisenberg's principle, according to which there is an inherent uncertainty in their whereabouts. In the pulses of light used in the experiment, it isn't possible to pick out a given component photon and observe it travelling at superluminal velocity.

The Princeton physicists believe this fundamental fuzziness associated with a finite pulse of waves prevents information from exceeding the speed of light, so in an operational sense the light barrier remains unbroken and the causal order of the cosmos is still safe. It is intriguing to see how the wave nature of light rescues the theory of relativity from paradox.

• Paul Davies is visiting professor of physics at Imperial College London, and author of About Time.

TAKEN FROM <http://www.guardian.co.uk/science/story/0,3605,345099,00.html>

Saturday, March 17, 2007

Gravitational Time Dilation


Gravitational time dilation is manifested in accelerated frames of reference or, by virtue of the equivalence principle, in the gravitational field of massive objects. In simpler terms, clocks which are far from massive bodies (or at higher gravitational potentials) run faster, and clocks close to massive bodies (or at lower gravitational potentials) run slower.

It is also manifested in any other kind of accelerated reference frame, such as an accelerating dragster or space shuttle. Spinning objects such as merry-go-rounds and ferris wheels are subject to gravitational time dilation as an effect of their rotation.

This is supported by general relativity via the equivalence principle, which states that all accelerated reference frames possess a gravitational field. According to general relativity, inertial mass and gravitational mass are the same. Not all gravitational fields are "curved" or "spherical"; some are flat, as in the case of an accelerating dragster or space shuttle. Any kind of g-load contributes to gravitational time dilation.
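For a static, spherically symmetric mass the effect is captured by the standard Schwarzschild formula. The sketch below (our addition, with round-number constants) compares a clock on Earth's surface with one at GPS orbital radius:

    import math

    # Gravitational time dilation outside a spherical mass
    # (Schwarzschild formula). Constants are round-number values.
    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
    M = 5.972e24   # Earth mass, kg
    c = 2.998e8    # speed of light, m/s

    def clock_rate(r):
        """Rate of a static clock at radius r, relative to one at infinity."""
        return math.sqrt(1.0 - 2.0 * G * M / (r * c ** 2))

    surface = clock_rate(6.371e6)  # Earth's surface
    gps = clock_rate(2.657e7)      # GPS orbit, about 20,200 km altitude

    drift = (gps - surface) * 86400  # seconds gained per day
    print(f"GPS clock gains ~{drift * 1e6:.0f} microseconds/day (gravity only)")

The gravitational term alone makes the orbiting clock run ahead by roughly 46 microseconds a day; the satellite's orbital speed subtracts about 7, leaving the net offset of roughly 38 microseconds per day that the GPS system actually corrects for.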

TAKEN FROM <http://en.wikipedia.org/wiki/Gravitational_time_dilation>

Friday, March 16, 2007

Castle Bravo


Castle Bravo was the code name given to the first U.S. test of a so-called dry fuel thermonuclear device, detonated on March 1, 1954 at Bikini Atoll, Marshall Islands, as the first shot of Operation Castle (a longer series of tests of various devices). Unexpected fallout from the detonation, which had been intended as a secret test, poisoned the crew of the Daigo Fukuryū Maru ("Lucky Dragon No. 5"), a Japanese fishing boat, and created international concern about atmospheric thermonuclear testing.

The bomb used lithium deuteride fuel for its fusion stage, unlike the cryogenic liquid deuterium that fueled the fusion stage of the first-generation U.S. Ivy Mike device. It was therefore the basis for the first practical, deliverable hydrogen bomb in the U.S. arsenal. The Soviet Union had previously used lithium deuteride in a nuclear bomb, its Sloika (also known as "Alarm Clock") design, but since that was a single-stage weapon its maximum yield was limited. Like Mike, Bravo used the more advanced Teller-Ulam design for creating a multi-stage thermonuclear device.

It was the most powerful nuclear device ever detonated by the United States, with a yield of 15 megatons. That yield, far exceeding the expected 4 to 8 megatons, combined with other factors to produce the worst radiological accident ever caused by the United States.
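The standard explanation of the excess yield (well documented in later accounts, though not spelled out above) is that the designers treated the lithium-7 making up most of the fuel as essentially inert, expecting tritium to be bred only from the lithium-6 fraction. In fact, fast neutrons split lithium-7 as well:

\[ {}^{6}\mathrm{Li} + n \rightarrow {}^{4}\mathrm{He} + {}^{3}\mathrm{H}, \qquad {}^{7}\mathrm{Li} + n \rightarrow {}^{4}\mathrm{He} + {}^{3}\mathrm{H} + n, \]

so far more tritium fuel was produced inside the secondary than the 4-to-8-megaton estimate had assumed.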

Though some 1,000 times more powerful than the atomic bombs which were dropped on Hiroshima and Nagasaki during World War II, it was considerably smaller than the largest nuclear test conducted by the Soviet Union several years later, the ~50 Mt Tsar Bomba.