Is a science geek, currently pursuing some sort of a degree (called a PhD) in Physics at TIFR, Mumbai. An enthusiastic but useless amateur photographer, his favourite activity is simply lazing around. He is interested in all things interesting and scientific.
A team of Japanese researchers has come up with ceramic lasers that can be fired every nanosecond, delivering powerful blasts that hold their intensity over very short ranges. They are just 9 mm in diameter. They might find use in the unlikeliest of places: the engine of a car, sounding the death knell for spark plugs and twitchy car engines.
A moment’s thought might help one describe this idea as ‘innovative’, rather than ‘outlandish’, though, in fact, it is both. Let’s see why.
Lasers are extremely accurate and can deliver ignition pulses that are precise in both time and space: they strike the right spot at exactly the right moment. The very low response time will make engines more sensitive to what the driver instructs them to do, which can matter a lot when a second or two makes all the difference.
Since lasers can be made to point accurately at any place, they can be designed to produce the ignition right in the middle of the chamber, which can then spread out evenly. Spark plugs deliver asymmetric ignition and, thus, some energy is wasted, since the force in the radial direction is not quite balanced. Lasers will eliminate such imbalances and give more push for the same ignition, concentrating all the force to act in the direction that matters.
Extremely important is the ability of lasers to produce very high temperatures within small spaces. This makes it possible to ignite a fuel-air mixture containing much less fuel. Spark plugs, which produce much lower temperatures, need a higher proportion of fuel to ignite the mixture. Fuel guzzlers may become a thing of the past.
Lasers will also come with a lower maintenance cost, so in the long run, they’ll save a lot of money for the owner of the car. They are also expected to make significant cuts in carbon emissions of engines.
Leaner, cleaner and fitter, lasers promise to replace spark plugs in the engines of the not-too-distant future.
The message is loud and clear: Our planet needs help to sustain us and we need to stand up and be counted. The Earth today faces many problems; global warming is one of the major ones. Other problems include pesticide overuse, unplanned drilling of oil-wells, reckless deforestation, overfishing of certain delicate marine species, denudation of coral reefs and expansion of deserts. People have been fighting against all of these evils, but mostly their acts have been isolated ones, separated from the rest. Earth Day is a concept that intends to amalgamate all these pro-environmental activities into one.
The concept of Earth Day
Earth Day was born in 1970, and US Senator Gaylord Nelson was responsible for it. It was organized as a teach-in program in thousands of school and college campuses, amongst other places. The challenges were many. The five sectors identified for improvement were the internal combustion engine, pesticide pollution, detergent pollution, aircraft pollution and non-disposable, non-recyclable containers.
Over the next forty years, the movement has grown, especially within the United States but also across the world, and the challenges have grown more diverse. In view of this increasing influence, the United Nations designated April 22nd as ‘International Mother Earth Day’, or ‘Earth Day’. The week starting from the 16th is called ‘Earth Week’.
Major Landmarks in the Environmental Movement
Let’s see two major early successes of the movement.
The first success came in the form of the Clean Air Act Amendment in 1970, which required automobile emissions to be cut by 90% by 1975. This involved huge subsidies from the government and also the commercial implementation of various pollution control measures, which were known but not implemented. This brought a drastic reduction in the emission levels, bringing down carbon monoxide levels by 96% in these four decades. The effect of this change can even be seen in the Antarctic snow.
Next was the banning of chlorinated hydrocarbons used as pesticides, principal amongst them DDT. This was achieved in much of the world by 1972, but there are still parts that use DDT both as a pesticide and as an insecticide against mosquito larvae. DDT is effective, but does more harm to the environment than good. It kills off creatures, like earthworms, in the immediate area, and then affects aquatic life in waters into which it is dumped. Bio-magnification through the food chain affects creatures as high up as eagles. Similar action was taken against phosphate-rich detergent pollutants.
Earth Day 2011: Aims, Hopes and Political Will
Earth Day 2011 aims to achieve ‘A Billion Acts of Green’, a pledge campaign aiming to get a billion people from around the world to pledge their allegiance to the environment. The key word is awareness.
Issues remain. Global warming is a key concern, especially the major influence humans are having in accelerating it.
Political will seems to be the most precious resource across the world, since legislation is the best way to impose regulations on a wide scale, and, as Al Gore once put it, ‘Political will is a renewable resource’.
If the Cosmos is the place of all things beautiful and unusual, the Hubble Space Telescope (known simply as ‘Hubble’ or HST) is the ultimate eye to see it with. Launched by NASA on 24th April, 1990, aboard the space shuttle Discovery, as the best of the space-based optical telescopes, Hubble has reached out to all.
Hubble’s images have filled the hard disks of active researchers and eager school students alike, and these have endeared the large floating eye in space to millions worldwide. It has captured stunning but violent galaxy collisions, seen never-before-seen nebular formations, glimpsed the merging of galactic black holes and captured the awe-inspiring and data-rich alleys of star-forming nurseries, all the while enthralling us and challenging our own perception of the vastness of the universe. In fact, the word ‘Hubble’ today resonates more with the telescope than with the famed astronomer, Edwin Hubble, after whom it is named.
Hubble also happens to be the only telescope ever serviced by astronauts in space. When Hubble started acquiring images, a flaw was found in the shape of the main mirror, which had been ground incorrectly. A collective sigh and gasp throughout the astronomy fraternity around the world was followed by a daring and successful servicing mission by NASA astronauts, who went into space and installed corrective optics to compensate for the flawed mirror. Hubble has never looked back since.
Enjoy the brilliant images below (they may take a second to load). Don’t forget to wish a very Happy Birthday to Hubble. To get more, click here.
Astronomers released this brilliant new snap by Hubble in order to celebrate its 21st birthday. (Go to link for a bigger image!)
Hubble is supposed to function till 2014, after which its successor, the James Webb Space Telescope, is expected to take over.
I expect a few moist eyes when Hubble is finally plunged into the ocean. I know that my eyes will be wet.
It’s atoms now, and not only light. Researchers at the ARC Centre of Excellence for Quantum-Atom Optics, Research School of Physics, ANU, have successfully guided supercooled helium atoms through an optical guide made of a laser beam. This is the first successful attempt at guiding matter waves in this way.
Speckles, Modes and the Rest of the Basics:
When light is guided in an optical fiber, there can be many modes of transmission. These modes interfere and produce a ‘speckle pattern’ on a screen after emerging from the fiber. The light can be adjusted so as to eliminate the speckle, which indicates that the light is in a single mode, or, technically, ‘coherent’. Scientists say that the light has the same ‘phase factor’ throughout, which doesn’t vary with time.
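The origin of speckle can be sketched in a few lines of Python. This is only a toy model (not the experiment’s actual optics): summing several modes with random relative phases gives a wildly varying intensity across the screen, while a single mode gives a perfectly uniform one.

```python
import cmath
import math
import random

random.seed(1)

def intensity_pattern(n_modes, n_points=8):
    # Sum n_modes waves, each with a random phase and a different
    # transverse spatial frequency, at points across a "screen".
    phases = [random.uniform(0, 2 * math.pi) for _ in range(n_modes)]
    pattern = []
    for p in range(n_points):
        x = p / n_points
        field = sum(cmath.exp(1j * (2 * math.pi * m * x + phases[m]))
                    for m in range(n_modes))
        pattern.append(abs(field) ** 2)
    return pattern

multi = intensity_pattern(20)   # many interfering modes -> speckle
single = intensity_pattern(1)   # one coherent mode -> uniform intensity

print("multi-mode :", ["%5.1f" % i for i in multi])
print("single-mode:", ["%5.1f" % i for i in single])
```

The single-mode pattern is flat because the modulus of one phasor is always 1; the multi-mode pattern fluctuates strongly from point to point, which is exactly the speckle the researchers photographed.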
There are many other coherent substances that can be made. One of them is known as the Bose-Einstein Condensate (BEC). During the 1920s, Satyendra Nath Bose and Albert Einstein worked out the statistics of bosons and showed that, if cooled enough, they can be made to fall into a single giant ground state. In this state, any addition to the number density of the particles makes more particles fall into the ground state. It is thus called a ‘condensate’, appropriately named the ‘Bose-Einstein Condensate’.
BEC is a remarkable state of matter. Thousands of bosons (for example, Helium atoms) can condense and behave like a single super-atom. BEC physics is one of the richest and the present interest is primarily because BEC physics mimics that of superconductors.
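The condensation can be made quantitative for the textbook case of an ideal Bose gas: below the critical temperature Tc, the fraction of atoms sitting in the ground state grows as N0/N = 1 − (T/Tc)^(3/2). A minimal sketch of that formula:

```python
def condensate_fraction(T, Tc):
    """Ideal Bose gas: fraction N0/N of atoms in the ground state."""
    if T >= Tc:
        return 0.0                    # no condensate above Tc
    return 1.0 - (T / Tc) ** 1.5      # grows as the gas is cooled

for t in (1.0, 0.8, 0.5, 0.1, 0.0):
    print(f"T = {t:.1f} Tc  ->  N0/N = {condensate_fraction(t, 1.0):.3f}")
```

At Tc the condensate fraction is zero; at absolute zero every atom is in the single giant ground state, which is what lets thousands of atoms behave as one super-atom.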
The guiding of matter waves
What the team of researchers achieved is this: they took a bunch of atoms, trapped them, and then loaded them into a laser beam pointing downwards, along gravity. This produced a speckle pattern.
As Ken Baldwin, one of the team members, reports
We have shown that when atoms in a vacuum chamber are guided inside a laser light beam, they too can create a speckle pattern – an image of which we have captured for the first time.
The atoms were cooled to lower and lower temperatures until they formed a BEC. Since the BEC is a coherent state, the speckle pattern suddenly disappeared as the intensity of the laser light was lowered.
Team leader, Dr. Andrew Truscott, reported that:
The atoms … behaved more like waves than particles, forming a Bose-Einstein condensate (BEC). When the BEC was loaded into the guide, the speckle pattern disappeared, showing that just one mode was being transmitted: the single quantum wave.
Looking at the images and by measuring the arrival times of the atoms on the Multi-Channel Plate (MCP), the researchers could differentiate between a speckled, multi-mode transmission and a smooth, single-mode transmission.
Earlier, only light could be guided in a waveguide (here, the optical fiber). That is no longer true. This breakthrough demonstrates that it is possible to guide atoms in a BEC state through an optical guide (not glass). This will allow higher-precision atom interferometers.
A team of researchers at the University of Pittsburgh, collaborating with another team from the University of Wisconsin at Madison, has created a transistor that is just a few atomic diameters in size and can be switched on and off by just one or two electrons. These transistors could be used in solid-state devices such as fast quantum processors (which might replace current Si processors) and extremely dense memory devices.
Lead researcher Prof. Jeremy Levy of University of Pittsburgh further emphasized that these new materials might be used to create substances like high temperature super-conductors.
The Device: A Single Electron Transistor
The device is basically an island, 1.5 nanometers (nm) in diameter, which is made of metal oxide. This island can house zero, one or two electrons only allowing it to be in very specific quantum states. Such exotic materials have never been made. Nanowires, 1 to 1.2 nm thick, carry electrons across the island, thus allowing conduction.
Existing Single Electron Transistors (SETs) are generally about a micron in size, so this is nearly a thousand-fold miniaturization.
The idea has been around for a long time. Graphene, too, has been tried for building an SET, with no real success so far, primarily because graphene doesn’t have a strict off state. This becomes crucial for transistors switched on and off by single electrons.
The wonder of the device is its extreme sensitivity to the presence of an electric charge. Further, the oxide base is ferroelectric and can retain electrons even when the device is switched off. If the number of electrons in the island is controlled, the device can act as a memory element, storing a 0 or a 1. Stacking many such islands together can create an ultra-dense solid-state memory device, which can be altered by the passage of minute amounts of electric current.
Fabrication in the pure state is currently a problem, but this is usually the case when a new solid-state device is made for the first time. Today’s improved Atomic Force Microscopy (AFM) techniques allow engineers to fabricate materials directly on the nanoscale with high precision, and this is the current line of attack.
If made on a commercial scale, these transistors could form the basis of quantum processors with unthinkable speeds, capable of performing in a day calculations that are estimated to take hundreds of years on today’s supercomputers.
A brand new method to measure gravity and minute quantizations in a gravitational field, which uses neutrons trapped between two vibrating parallel plates immersed in the field, has been developed by scientists at the University of Technology, Vienna (TU Vienna). Neutrons have earlier been used for electromagnetic (EM) field measurements, but similar methods are now being used to measure gravity, a force which is about 10^-36 times (i.e. one part in a billion billion billion billion) as strong as the EM force.
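That famous 10^-36 figure can be checked by comparing the gravitational and electrostatic forces between two protons; since both forces fall off as 1/r^2, the distance cancels out of the ratio:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
k = 8.988e9        # Coulomb constant, N m^2 C^-2
m_p = 1.6726e-27   # proton mass, kg
e = 1.602e-19      # elementary charge, C

# Ratio of gravitational to electrostatic force between two protons.
# F_grav = G m^2 / r^2 and F_EM = k e^2 / r^2, so r^2 cancels.
ratio = (G * m_p ** 2) / (k * e ** 2)
print(f"F_grav / F_EM = {ratio:.1e}")   # of order 10^-36
```

The exact number depends on which particles you compare (for electrons it is even smaller), but the order of magnitude, one part in 10^36, is why gravity is so extraordinarily hard to probe at the quantum scale.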
Any field which can be quantized (EM can be quantized; gravity cannot, as yet) contains discrete energy levels, which can be occupied by quantum particles. A particle cannot occupy a space between two successive levels. It may, however, jump (technically, make a ‘transition’) from one quantum state to another, giving off radiation in the process.
A quantum particle in a certain state needs to be excited with just the right amount of energy for it to make a transition to a higher energy state. This process is called ‘resonance’.
For quantizing any field, it has to be bounded in space within some finite range. This is conveniently achieved by limiting the extent of the experimental apparatus between two parallel plates. These plates may even be used to induce transitions, as we will see below.
To probe gravitational fields, neutrons are being confined between two closely spaced parallel plates, which can be vibrated at very precise frequencies. If gravity can, indeed, be quantized, then each of the neutrons sits in one of the energy levels in the gravitational field. By vibrating the plates at a very precise frequency (the ‘resonant’ frequency), just the right amount of energy can be pumped into the system. This energy will then be taken up by the neutrons, which will ‘jump’ to higher quantum levels. By measuring the resonance peaks in the vibrational spectrum, scientists hope to accurately map out the quantum levels in the gravitational field.
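The scale of these levels can be estimated in the simplest approximation: a neutron bouncing above a single mirror in Earth’s gravity (a simplification of the two-plate setup described above). Quantum mechanics then predicts energy levels set by the zeros of the Airy function, E_n = (m g² ħ² / 2)^(1/3) |a_n|, with the lowest level around a picoelectronvolt. A rough sketch of the numbers, using tabulated Airy zeros rather than a full solver:

```python
hbar = 1.0546e-34   # reduced Planck constant, J s
h = 6.6261e-34      # Planck constant, J s
m = 1.6749e-27      # neutron mass, kg
g = 9.81            # gravitational acceleration, m/s^2
eV = 1.602e-19      # J per eV

# Magnitudes of the first zeros of the Airy function Ai (tabulated values)
airy_zeros = [2.33811, 4.08795, 5.52056, 6.78671]

scale = (m * g ** 2 * hbar ** 2 / 2) ** (1 / 3)   # characteristic energy, J
levels = [scale * a for a in airy_zeros]          # E_n = scale * |a_n|

for n, E in enumerate(levels, 1):
    print(f"E_{n} = {E / eV * 1e12:.2f} peV")

# Frequency at which the plates would have to vibrate to drive
# the 1 -> 2 transition resonantly
f12 = (levels[1] - levels[0]) / h
print(f"resonant frequency for 1 -> 2: about {f12:.0f} Hz")
```

The lowest level comes out near 1.4 peV and the 1 → 2 resonance lands in the range of a few hundred hertz, i.e. acoustic-scale vibrations of the plates, which is what makes the experiment feasible in an ordinary laboratory.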
Extremely cold neutrons are used instead of atoms or electrons, because they are heavy particles and also uncharged. They are unaffected by EM fluctuations, are nearly non-polarizable and are unaffected by the Casimir force.
Gravity and its Quantization
The problem of trying to quantize gravity started with Einstein, when only the EM force had been quantized. Since then, the weak and the strong forces have been quantized and unified into a single theory. Gravity has survived all attempts at quantization and unification. A primary problem with gravity is that the static space-time background present for the other forces is itself distorted by gravity. (In fact, relativity says that the distortion of space-time is gravity.) The results of this experiment might give valuable clues as to the energy scales needed for unification. It may also demonstrate the very limits of what a unified theory (like string theory, in its many versions) can achieve.
The experiment is much smaller in scale than the existing LIGO and can be performed in a laboratory. Questions remain as to how fine the measurements need to be in order to be fully sure of the result.
The experiment also hopes to verify the validity of the equivalence principle (which says that gravitational and inertial masses are exactly the same) at extremely small length and energy scales. This principle is crucial for the correctness of general relativity (GR), and thus will deliver a verdict on the applicability of GR at quantum scales. A more sophisticated version of the experiment might even be used to probe into the nature of dark matter, but that is still some time away.
No one is sure if this will work, but as Pauli said, ‘He who dares, wins’.
WHO sounded alarm bells about Antimicrobial Resistance (AMR) this past World Health Day, on April 7th, and the warning does not come a moment too soon: the writing has been on the wall for a long time. Excessive prescription of antibiotics, and that too in high doses, is making microbes change faster than the speed of drug research. WHO warned of an imminent ‘pre-antibiotic age’, in which we will be stuck with a large number of different, but ineffectual, antibiotics.
The threat is especially severe for diseases that spread through the air, water or vector agents. It has long been predicted that TB will develop resistance and that the conventional drugs will not work at normal doses. Very high doses may harm the patient more than the bacteria themselves, or may have severe side-effects.
Bacterial strains, like Escherichia coli (E.coli), have already been observed developing medicinal resistance at conventional antibiotic doses. E.coli can mutate with amazing speed. They can go through about 20 generations in four hours, given an aerobic (i.e. oxygen-rich) environment and enough food (glucose). Scientists have observed that about 200 generations are enough for a potentially deadly strain to develop, a feat that E.coli can achieve in 2 days.
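The arithmetic above is easy to check: 20 generations in four hours means one generation every 12 minutes, so the 200 generations needed for a dangerous strain take about 40 hours, roughly two days, as stated. A quick sketch:

```python
gen_time_h = 4 / 20     # hours per generation: 12 minutes
gens_needed = 200       # generations for a resistant strain (per the article)

hours = gens_needed * gen_time_h
print(f"time for {gens_needed} generations: {hours:.0f} h = {hours / 24:.1f} days")

# Unchecked exponential growth over that many doublings
# (in principle; in practice nutrients run out long before this):
print(f"one cell -> 2^{gens_needed} = about 10^{gens_needed * 0.30103:.0f} cells")
```

The staggering power of that doubling, combined with random mutation, is precisely why resistant strains can emerge on the timescale of a hospital stay rather than an evolutionary epoch.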
Many antibiotics adhere to the bacterial cell wall and slowly break it down. Bacteria change (or, rather, evolve) their cell wall composition slightly, disallowing the adhesion, and thus become immune. A greater fear is the crossing of genes amongst different bacterial strains, producing a strain to which the human immune system has no response. Something similar happened in the case of SARS, although that was a virus at work. E.coli has frequently been observed to cross with highly resistant Salmonella strains.
Multi-drug resistant (MDR) strains are also emerging at an alarming pace. A staggering 440,000 cases of MDR-TB are reported annually, causing 150,000 deaths across 64 countries worldwide, according to WHO.
AMR threatens a return to the pre-antibiotic era: Many infectious diseases risk becoming uncontrollable and could derail the progress made towards reaching the targets of the health-related United Nations Millennium Development Goals set for 2015.
The first response to this threat is the judicious prescription of drugs by medical practitioners. WHO called upon all stakeholders in medical practice and requested more responsible behavior. Citizens, too, are responsible for maintaining a healthy lifestyle.
Just as prevention is better than cure, an immune system is a better defense than any antibiotic.
Researchers from Australia and Japan have recently reported a successful attempt at quantum teleportation of a complex quantum system from a certain point A to another point B without losing information. The team was led by scientists from the University of Tokyo, in the lab of Professor Akira Furusawa. This leads to the possibility of achieving fast, high-fidelity transmission of huge chunks of data, all at once, thus revolutionising the current data transportation scenario and providing a boost to the ongoing research on quantum computers.
Schrodinger and his Cat
The Schrodinger Cat paradox appeared in 1935 and was proposed by Erwin Schrodinger, one of the founders of quantum theory. This thought experiment (‘gedankenexperiment’) consists of the following setup: a cat is kept in an opaque box with a sealed glass chamber inside it, containing poisonous gas. The glass can be broken by a hammer, which is itself triggered by the decay of a radioactive atom. Since the decay of any radioactive atom is governed by quantum laws and is, thus, entirely probabilistic, no one can say whether the cat is alive or dead with absolute certainty. The answer, however, becomes obvious when one looks into the box. Before the observation, one is forced to conclude that the cat is dead and alive at the same time; it is in a superposition of ‘dead’ and ‘alive’ states. Thus, observation changes the system irreversibly; scientists call this ‘collapse’.
Whereas quantum superposition and collapse are well-accepted by physicists, applying them to macroscopic objects like cats instead of quantum particles makes the situation very strange. The strangeness enticed Schrodinger enough to propose this paradox.
Teleportation, Qubits and the Quantum Computer
The concept of superposition is employed in quantum computers. Unlike a conventional computer, where a bit has the value 0 or 1, a quantum bit (or qubit) can be in a superposition of the two values. Thus, N classical bits represent only one of their 2^N possible states at a time, whereas N qubits can represent all 2^N states at once. This greatly augments both the speed and the volume of computations.
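The 2^N scaling is easy to see if a register of N qubits is written out as a vector of 2^N complex amplitudes. A minimal sketch (a plain state vector for illustration, not a real quantum simulator):

```python
import math

def uniform_superposition(n):
    """State vector of n qubits in an equal superposition: all 2^n
    basis states carry the same amplitude 1/sqrt(2^n)."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(10)   # 10 qubits, the current record mentioned below
print("basis states represented at once:", len(state))
print("probability of measuring any one outcome:", state[0] ** 2)
```

Ten qubits already hold 1024 amplitudes; measuring the register collapses all of them to a single outcome, which is why any stray interaction with the environment is so destructive.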
The main problem in achieving quantum computers is that the qubits are highly susceptible to external influences. As discussed above, observing a qubit (i.e. any interaction with the environment) will collapse it into one of the many states that it represents. Thus, a number of qubits cannot be stacked up at a place, unlike computer bits. (Currently, 10 qubits have been ‘stacked’ up.)
Another phenomenon used is quantum entanglement. If two particles are entangled at some point, then an observation on one of them influences the other, even though the latter is not being directly observed. This finds a (hitherto hypothetical) application in performing one operation on a certain part of the computer and thus influencing a separate, entangled operation. The concept can also be exploited in quantum cryptography.
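The perfect correlation of an entangled pair can be mimicked for illustration. This toy model captures only the correlation of a Bell pair (|00> + |11>)/√2, not the genuinely quantum features such as violation of Bell inequalities; both outcomes are equally likely, but the two qubits always agree:

```python
import random

random.seed(0)

def measure_bell_pair():
    """Measure both qubits of a (|00> + |11>)/sqrt(2) pair in the
    computational basis: each branch occurs with probability 1/2,
    and the two results always coincide."""
    outcome = random.choice([0, 1])
    return outcome, outcome

results = [measure_bell_pair() for _ in range(1000)]
print("all outcomes correlated:", all(a == b for a, b in results))
print("fraction of 1s:", sum(a for a, _ in results) / len(results))
```

Measuring one qubit instantly fixes what the other will show, no matter how far apart they are; it is this guaranteed agreement that quantum cryptography turns into a tamper-evident key.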
The Japanese team of researchers have successfully teleported a macroscopic system of photons (particles of light), whose phases were superposed.
The situation at the Fukushima Daiichi nuclear power plant in Japan has just been reported to be worse than previously estimated, but still nowhere close to Chernobyl. A couple of days back, on 12th April, the Japanese Nuclear and Industrial Safety Agency reviewed the situation and raised its previous rating of 5 to the maximum of 7 on the International Nuclear and Radiological Event Scale (INES). The only other incident in history to get a rating of 7 is Chernobyl.
What is INES?
The INES scale is used by the International Atomic Energy Agency (IAEA) to gauge and compare different incidents of radioactive spillage and similar disasters. A low rating on the scale involves the misplacement of small quantities of lightly radioactive substances, which can easily be ‘washed off’. A high rating, like that of Chernobyl, involves a widespread release of high amounts of radioactive substances into the air or into the sea. The rating can be given for an entire disaster site or for certain specific areas of the site.
Ratings since the tsunami
The INES ratings have changed a number of times since the tsunami struck the plant. Earlier, each of the cores had been rated 5, meaning that they posed a danger of emitting large amounts of radioactive material if not handled with utmost care. The recent rating was given for the entire plant, and not just the reactors, reflecting the spread of radioactive materials in the air and the dumping of the same into the neighbouring sea. It also recognises that the entire area, and not just the reactor cores, needs to be treated with considerable caution.
Comparisons with Chernobyl
Given that, before Fukushima, Chernobyl was the only incident to score a 7 on this scale, the comparisons are inevitable, but, many experts feel, unfair. Most of the radioactive material released has drifted towards the Pacific Ocean, where it has little chance of poisoning human habitats. Further, the radioactive wastes that have been dumped into the ocean pose relatively little threat, owing to the high degree of dilution. This has called into question the validity and the use of the rating system, especially when it has such high potential to mislead the public.
Cleaning up the mess
Cleaning up the Fukushima mess may take more than 10 years, experts feel. The immediate concern is cooling the core with sufficient water, all the while keeping it submerged. This involves pumping water in and out periodically, and then disposing of the lightly radioactive water into the sea. The core is kept under about 20 feet of water, which also helps shield the radiation from the core. Japanese authorities are also considering using robots to clean up the innermost parts of the reactors, but nothing has been implemented as yet.
The situation is more stable than it was a few days back, and the reactors are no longer throwing up new surprises. The cooling system is being repaired and should be operational soon.
The hi-tech whistle-blower Julian Assange, currently at number 10, gets a second nomination from TIME after being beaten last year by Mark Zuckerberg for the Person of the Year title. The Fukushima nuclear power plant workers get their due mention and currently hold the 13th position.
Also significant is the presence of Christopher Hitchens, currently at number 7, above Julian Assange (No. 10) and Mark Zuckerberg (No. 24). One of the world’s foremost intellectuals and humanists, and a hardened atheist, Hitchens has seen his popularity soar following the publication of his memoir Hitch-22. He was diagnosed with esophageal cancer shortly after, and people with and without religious beliefs have poured in their good wishes and prayers for his recovery.
The former head of the Human Genome Project, Prof. Francis Collins, one of the world’s most brilliant geneticists and one of Hitchens’ debate opponents (Collins is a devout Christian), has taken it upon himself to experiment on Hitchens (with his full consent) and find a cure. Hitchens still writes for Vanity Fair, appears in public debates, never shies away from an interview and remains just as fiery and controversial as ever, even though his cancer is in its last stages. Coincidentally, today, the 13th of April, is also his birthday.
Cast your vote for your favourite nominee within the next 24 hours. The winner gets into the TIME 100 list.
IBM on Tuesday, 12th April, announced that they have made the world’s fastest transistor using graphene and also hinted that they might go into commercial production very soon. This is major news, as graphene might revolutionize the current semi-conductor industry scenario. Graphene may even be good enough to replace silicon, the standard material used in all of today’s semi-conductor devices, in the near future.
What is Graphene? How is it produced?
Graphene is a mono-layer of carbon, with the atoms in a hexagonal configuration. Each carbon atom bonds with three neighbouring carbon atoms, maintaining what is called sp2 hybridization. Basically, it is a single layer of graphite.
It is produced in the most mundane way you can think of. A pure crystal of graphite is repeatedly stripped using Scotch Tape until, about 50 or more repetitions later, graphene is found buried amongst poly- or bi-layered graphite. This has to be verified, which can be done optically after the extracted graphene is mounted on a silicon dioxide (SiO2) substrate of the correct thickness. Often, Raman spectroscopy is used for verification.
Other methods, which allow graphene to be grown for commercial purposes, are also known. Primary among these is Chemical Vapour Deposition (CVD), in which carbon vapour (obtained from carbon rich substances like acetylene) is deposited on a Ni or Cu substrate.
How did IBM do it?
Graphene grown directly on an SiO2 substrate suffers from scattering of electrons, which deteriorates the transport properties and also produces non-uniformity across the SiO2 wafer. The IBM team used a novel substrate called ‘diamond-like carbon’ (DLC) on top of the SiO2 layer so as to reduce the scattering. DLC is, loosely speaking, amorphous diamond: its atoms are predominantly in the sp3 configuration (i.e. each atom bonds to four neighbours) in a tetrahedral arrangement, but, being amorphous, it lacks any fracture planes. Thus, while having the necessary properties of diamond, it is also flexible and can easily be used as a coating over a substrate.
Graphene couples weakly to the DLC layer, and this greatly reduces the scattering, as well as the temperature dependence of the material. In fact, the transport properties of DLC-grown graphene remain almost (maybe exactly) temperature-independent right down to 4.3 K (about minus 269 degrees Celsius).
The IBM team took graphene made using CVD on a Cu layer and, after protecting it with polymethylmethacrylate (PMMA), dissolved the Cu layer using FeCl3. The PMMA-graphene was then transferred to a DLC layer on the SiO2 substrate and the PMMA was removed. Raman spectroscopy was used to verify the quality of the graphene layer.
Graphene lends itself readily to fabrication of Field Effect Transistors (FETs). By constructing the gate, drain and source contacts using pure metal and properly calibrating their device, the IBM team achieved input-output characteristics similar to a Si FET. Further, they achieved very high switching speeds up to 26 GHz for a 550 nm long device.
How will it score over Si transistors?
Graphene transistors will be ideal in radio frequency (RF) range signals, due to their high switching speeds. Unlike Si transistors, their properties don’t degrade at low temperatures. This means that there will be no unnatural change in transport properties as the temperature is varied.
The problem graphene transistors have is their low on-off current ratio. However, that is not a very strict requirement for RF communication. A more serious problem is the high contact resistance, which cannot be minimized as it can for Si MOSFETs.
IBM reports that transistors with cut-off frequencies (the frequency at which the current gain becomes unity) as high as 155 GHz have been achieved on a 40 nm device using short gate lengths.
A figure of merit is the product of cut-off frequency and gate length. IBM reports a figure of 13 GHz·µm for the 550 nm device, which beats the highest value of 9 GHz·µm for Si MOSFETs by a long margin.
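Multiplying the cut-off frequencies quoted above by the corresponding gate lengths is a quick consistency check on these numbers; note that the products naturally come out in GHz·µm:

```python
# (cutoff frequency in Hz, gate length in m), numbers quoted in the article
devices = {
    "550 nm graphene FET": (26e9, 550e-9),
    "40 nm graphene FET": (155e9, 40e-9),
}

foms = {}
for name, (fT, Lg) in devices.items():
    # figure of merit fT * Lg, converted Hz*m -> GHz*um (1 GHz*um = 1e3 Hz*m)
    foms[name] = fT * Lg / 1e3
    print(f"{name}: figure of merit = {foms[name]:.1f} GHz*um")
```

For the 550 nm device this gives about 14 GHz·µm, in the same ballpark as the quoted figure of merit of 13.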
It has been variously billed as ‘The Final Frontier’, ‘The Great Unknown’ and ‘The Heavens’. It has enthralled humans since the days of the caveman. It is mysterious; it is attractive. On 12th April, 1961, space was conquered by a man named Yuri Gagarin, aka the Columbus of the Cosmos. Fifty years ago today, the first manned space-flight took place.
Yuri Gagarin was born on 9th March, 1934. He became a pilot in the Soviet Air Force at the age of 22. He was promoted to the rank of Lieutenant after two years and, subsequently, to Senior Lieutenant. He was chosen from amongst many others to be one of the pilots for the Soviet space program, and, years later, to be part of the backup crew for the Soyuz-1 mission, which ultimately ended in disaster. He was to be the replacement for his dear friend, Vladimir Komarov. (It was reported that Komarov knew about the impending failure of the Soyuz-1 mission before the launch and had anticipated his death. Gagarin, also aware of the problems with the craft, pleaded with Komarov to be allowed to take his place, but Komarov refused.)
The Space Race had begun in 1957 with the Soviet launch of Sputnik-1. When the 1960s dawned, the race was just heating up. A new mission was to be launched in 1961, and Gagarin was widely regarded as the man best fit for the job. Vostok-1, as the mission was named, was to be a one-man mission.
On 9th April, 1961, three days before the actual launch date, Gagarin was chosen for the Vostok-1 mission. Minutes before lift-off, he asked for some music to be played over the radio. At 6:07 AM, the craft was launched; at 6:17 AM, it entered orbit, marking a historic moment. Man had stepped into the Final Frontier.
At about 8:00 AM, the spacecraft re-entered the atmosphere. Gagarin ejected safely minutes later and landed by parachute.
Google honored the first man in space with a doodle on its homepage.
This was the first big success in the Space Race for either side. Eight years later, American astronauts would set foot on the Moon.
A bill that could be used to shield teachers and education policy makers who refrain from teaching evolution and global warming in class was passed by the Tennessee House of Representatives on Thursday. This has got to be good news for people on the extreme right, predominantly Creationists, who have long wanted legal protection for their ideas. The bill passed the House by a vote of 70-23.
Creationism and Intelligent Design: An introduction
Creationists hold a view of the world inspired by a literal reading of the Bible, which claims that the world and all living things were created by God; no natural explanation can account for the utter complexity and diversity of the living world. The most famous argument in support of this line of thinking was put forward by William Paley in his book 'Natural Theology' in 1802. Paley makes the now-famous watchmaker argument: if one sees a stone in a forest, one wouldn't give it a second thought; if one saw a watch lying on the ground, however, one might ask about the watchmaker. Since living creatures are far more complex than a watch, and the watch needs a maker, shouldn't living creatures need a maker too? In 1859, Darwin published 'On The Origin of Species', which answered this question. The answer was evolution by means of natural selection.
Recently, a different and more subtle form of Creationism, by the name of Intelligent Design (ID), has been gathering steam. It claims that evolution cannot explain all the diverse forms seen in the living world without running into problems of 'irreducible complexity', and that there must be an intelligent driving force.
History: Kitzmiller vs Dover
One of the most potent challenges to evolution came in 2005 from a group of people led by the Discovery Institute, culminating in the famous Kitzmiller trial. The world media, which descended on the sleepy town of Dover, covered the brilliant testimonies of a number of people from the pro-evolution side, especially those of Ken Miller, a professor at Brown University. The case ended on 20th Dec, 2005, with the judge ruling heavily in favor of the evolutionists. The Discovery Institute remains undaunted, and there will be future challenges.
The bill reads: "This bill prohibits the state board of education and any public elementary or secondary school governing authority, … from prohibiting any teacher in a public school system of this state from helping students understand, analyze, critique, and review in an objective manner the scientific strengths and scientific weaknesses of existing scientific theories covered in the course being taught, such as evolution and global warming."
Looks good on the face of it, right? It does, till you read the last clause, which makes it clear that evolution and global warming can be targeted, and legally at that. So, if a teacher wants to argue to his/her students that the Earth is 6000 years old, there will be a law protecting that teacher.
The bill still has to be ratified by the Tennessee Senate, where it will be put to a vote on the 20th of April. So some hope remains.
This is big, really big! It may be the biggest news to hit the particle physics world in the last 50 years. Scientists analyzing data collected at the Tevatron at Fermilab have detected an anomaly that could well usher in a new dawn in theoretical physics and change the Standard Model as we know it. The observation was a bump in the data, but in the 'wrong' place.
Scientists are excited about a Gaussian peak, observed on Wednesday, 6th April, centered at about 150 GeV with a spread of 2.5 GeV and corresponding to nearly 300 events.
Physicists are generally quite skeptical about any news of big breakthroughs. This ensures that discoveries are really authentic. Most 'discoveries' are just mistakes in the data-analysis code, human error, or plain background fluctuations, and all of these have to be ruled out. Coding errors can be ruled out by using many orthogonal samples of data, called 'control sets'. Background fluctuations take a bit more effort, but routine analysis can eliminate them almost completely. A peak that survives background elimination cannot be discarded.
For all this skepticism, almost all major discoveries in high-energy physics have been accidental. The key to such a discovery is rigorous analysis of the data.
Is this the Higgs Boson?
The knee-jerk reaction was to suspect the discovery of the Higgs, the bosonic particle believed to endow all fundamental particles with mass. The Higgs boson, however, is ruled out: if a Higgs at this mass were produced at a non-negligible rate, we would expect to see its characteristic decay jets, consisting mainly of bottom quarks. No such jets have been observed, ruling out this possibility.
Is this a new force of nature?
It is too early to comment. The discovery of a new particle – a new boson – has to be confirmed. Only further investigation can answer this question.
What could fit the bill?
A new particle, coincidentally proposed in a paper a few days back, could fit the bill. The particle is called the Z' boson, in analogy with the Z boson. The Z' boson is expected to decay via semi-leptonic channels (i.e. a mixture of hadrons and leptons), and semi-leptonic jets have been observed. So, maybe, the Z' is the 'new' particle.
The result now stands at a 3-sigma confidence level, meaning that the probability of the observation being a mere statistical fluctuation is less than 1 percent. Physicists demand a 5-sigma confidence level before claiming a discovery, which means the chance of a fluke must fall below about 0.00003%. To attain this level of confidence, scientists will need more sets of data and rigorous analysis of the same.
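The sigma-to-probability conversion is just the tail of a Gaussian distribution. A minimal sketch, assuming the one-sided tail convention usually quoted in particle physics (5 sigma being the customary discovery threshold):

```python
import math

def tail_probability(sigma: float) -> float:
    """One-sided Gaussian tail: chance that a pure background
    fluctuation produces an excess at least `sigma` deviations high."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for s in (3, 5):
    print(f"{s}-sigma -> p = {tail_probability(s):.2e}")
# 3-sigma gives p ~ 1.3e-3 (about 0.13%), 5-sigma gives p ~ 2.9e-7
```

This is why a 3-sigma bump is only "evidence": roughly one in a thousand background-only experiments would show it by chance, and thousands of mass bins are being searched.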
More data is on the way. As Prof. Nima Arkani-Hamed of the Institute for Advanced Study, Princeton, notes, the LHC should produce much more data and copious events if this is indeed a real discovery.
One thing is for sure: this is exciting. If this is true, this is pure gold for particle physicists.
UPDATE: Fermilab has rejected the new particle discovery after extensive data analysis.
The real fear everyone has always had about nuclear power is the waste. The troubled Japanese nuclear power plant, Fukushima Dai-ichi, is dumping radioactive materials into the neighboring sea to dispose of them. This has raised a few alarms, but there is not much to worry about right now.
The sea water spreads the radioactive materials a long way, but also dilutes them in the process. The radiation level for iodine (I-131) and cesium (Cs-137) drops to about a thousandth once it moves about 20 miles offshore, according to Ken Buesseler of the Woods Hole Oceanographic Institution in Woods Hole, Massachusetts.
The saving grace for marine life around Japan is the powerful Kuroshio ocean current, which blocks the contamination from moving southwards and affecting life in Tokyo Bay.
Releasing the waste into the sea is really the best available option: sea water has been seeping into the nuclear plant ever since the tsunami, and the slightly radioactive water needs to be released to make room for storing the more radioactive waste.
Effect on sea-life:
Experts claim that the effect on the surrounding sea-life, given the current state of affairs, will be minimal. There will, of course, be an increase in radiation levels, but not a dangerous one. Most marine creatures, especially fish, are fast moving and thus will not be exposed to the radiation continuously for long. The animals are expected to cope with the increased levels.
Effect on sea-food lovers:
The risks of genetic mutation drop rapidly to zero even 5 km offshore. Even within this 'danger zone', the radiation level is not high enough to seriously worry geneticists. Fishing in this region has been disallowed, and sea food, whether consumed inland or exported, will be checked for radiation levels for quite some time.
Experts measured the radiation levels on the west coast of the USA (California) and reported minor fallout, but nothing serious.
The situation, at present, doesn't look too bad, and it is not expected to worsen much.