NASA: Jupiter Spacecraft Detects Problem, Turns Off Camera
By Alicia Chang
Los Angeles -- A NASA spacecraft circling Jupiter has hit another snag. The space agency said Wednesday that Juno detected a problem, went into safe mode and shut off its cameras and instruments hours before it was supposed to pass over Jupiter's dense
Japanese Scientist Wins Nobel for Study of Cell Recycling
By Malcolm Ritter And Karl Ritter
NEW YORK -- Like a busy city, a cell works better if it can dispose of and recycle its garbage. Now a Japanese scientist has won the Nobel Prize in medicine for showing how that happens. The research may pay off in
Human DNA Tied Mostly to Single Exodus From Africa Long Ago
By Malcolm Ritter
New York -- The genetic ancestry of people living outside Africa can be traced almost completely to a single exodus of humans from that continent long ago, new studies suggest. Still, a tiny legacy from an
Pluto 'Spray Painting' Poles of Its Big Moon Charon
By Marcia Dunn
Cape Canaveral, Florida -- The paint is actually Pluto's continually escaping atmosphere. Methane and other gases from Pluto end up coating Charon's frozen poles, which are so cold and where winters are so long that this buildup remains for decades.
Asteroid Bennu Getting First Visitor in Billions of Years
By Marcia Dunn
Asteroid Bennu, a black roundish rock taller than the Empire State Building, is the intended target of a NASA spacecraft set to blast off Thursday night. Not only will the robotic probe named Osiris-Rex fly to this ancient asteroid, it will scout it out for two years before
Plight of African Lions Persists One Year After Cecil Killing
By Christopher Torchia
Johannesburg -- Some call it the Cecil the lion effect.
A year ago, an American killed a lion in Zimbabwe in what authorities said was an illegal hunt, infuriating people worldwide and invigorating an international campaign against trophy hunting in Africa. Some conservationists, however
For years physicists have proposed theories to explain why the universe doesn’t just fly apart, given what appears to be quite a bit of empty space between star systems, planets and galaxies.
Is Fusion Energy a Viable Alternative for the Future?
As fuel prices have increased over the past decade, there has been more talk about relying on alternative sources of energy like wind power and solar power.
While those sources are important for weaning the world off fossil fuels in the long term, by their nature they can, at best, do little more than supplement a major source of energy.
Still, the long-term outlook for fossil fuels like petroleum is clouded by peak production and ever greater competition for the supplies that remain.
Developing countries like India and China have an ever greater need for energy sources.
There are additional reasons for the increase in the price of oil, among them the devaluing of the dollar.
Still, there’s a definite need to develop and make available to the world’s population alternative sources of energy.
If solar energy and wind power cannot completely replace fossil fuels as a major source of energy, what is a possible alternative?
Over the years some proponents of alternative energy sources have said that nuclear energy is a likely candidate.
However, the disaster at Fukushima last year revealed that nuclear power isn’t a viable option as a replacement for fossil fuels.
Nuclear power plants can and should be made safer; emergency power to keep the cooling pools in which spent nuclear rods are stored must be assured. And construction standards need to be improved and upgraded to account for earthquake risks and possible tsunamis.
However, last year’s disaster showed that nuclear power isn’t a likely candidate to replace fossil fuels as a major source of energy.
What alternative is there?
Perhaps the alternative, or at least a likely one, is fusion power, which hitherto hasn’t been a viable alternative due to the extreme difficulty of making it practical for commercial, continuous use.
What are some of the obstacles to making fusion power commercially viable or practical for generating electricity?
For one thing, a power plant would have to be made of material that could withstand the enormous temperatures required to produce fusion.
The material would have to be able to avoid becoming radioactive and brittle due to the bombardment of high energy nuclear particles, most likely neutrons.
One way of creating fusion is by using high-powered lasers that would be amplified many times after passing multiple stages.
Until recently, one of the problems with this approach was that scientists had to put far more energy into the lasers than was derived from the fusion of the two hydrogen isotopes. In sum, energy invested was greater than energy realized.
The hope is that the energy released by fusion could be used to run a power plant.
Another method of fusing hydrogen atoms involves using high-powered magnets, which would hold hydrogen isotopes together while microwaves heat them. In effect, the pressure exerted by the magnets would hold together the plasma (the “soup” of electrons shorn from the hydrogen atoms, along with the nuclei of those atoms) to keep it from “spilling” or escaping.
The microwaves would heat the plasma to the extremely high temperatures needed to achieve fusion, at least in theory.
On the other hand, the laser process focuses a high-powered laser on a pellet, which contains the two hydrogen isotopes (deuterium and tritium).
The high-powered laser—in effect, a laser that’s been amplified many times—would crush the pellet’s core such that hydrogen isotopes inside would fuse together, releasing helium, a neutron and a great amount of energy, which, in turn, would, one presumes, eventually power steam turbines that would then generate electricity.
Thus, it’s said that a fusion reaction is much like the reactions that occur in the sun, which is composed mostly of hydrogen with some helium. In the sun, of course, temperatures are extremely high and a great amount of pressure is constantly being exerted, creating fusion.
One of the major difficulties with fusion is maintaining the process in a continuous way.
It’s one thing to recreate some of the sun’s processes in a controlled environment, in a lab, but it’s another to replicate those processes on a long-term and continuous basis.
In addition, a fusion power plant would have to make its own tritium, which, unlike deuterium, is not readily available; for its part, deuterium comes from seawater and, thus, is abundant. But the hydrogen isotope tritium would have to be made in the fusion reactor.
As mentioned above, the laser process involves boosting or amplifying the power of the laser beam many times before it reaches the chamber containing the pellet, which itself contains the hydrogen isotopes.
At the National Ignition Facility at the Lawrence Livermore Laboratory, the world’s largest and most powerful laser system, the initial weak laser is split and sent through preamplifiers; then it passes through amplifier glass slabs via 192 separate beam channels.
The process is repeated over 52 passes; this increases the laser’s power by about 25% on each pass.
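Taken at face value, those figures imply an enormous overall gain. A quick calculation, using only the numbers quoted above (not NIF’s actual specifications), shows how a 25% boost per pass compounds over 52 passes:

```python
# Rough sketch: compounding gain of a laser amplified ~25% per pass,
# using only the figures quoted in the text (52 passes, +25% each).
gain_per_pass = 1.25
passes = 52

total_gain = gain_per_pass ** passes
print(f"Overall amplification factor: {total_gain:.3e}")
```

Compounding turns a modest per-pass boost into an amplification factor on the order of 10^5, which is why the beam makes so many passes through the amplifier slabs.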
Eventually, after all the required passes, the lasers converge on the sides of the gold hohlraum, which holds the pellet; the hohlraum is located in the target chamber. The target chamber is about 30 feet in diameter and has a number of portals through which the lasers pass.
After the lasers converge onto the sides of the hohlraum, x-rays are emitted.
In turn, the x-rays burn off the outer layer of the pellet, compressing the inner pellet many times and heating it to 100 million degrees, the temperature necessary for fusion.
The sudden surge in pressure and temperature triggers fusion. The pressure needed is what’s found at the center of the stars.
These temperatures and pressures are necessary in order to overcome the electrostatic repulsion between the positively charged nuclei (like charges repel) and to induce fusion.
All electrons are torn from the deuterium and the tritium atoms during the fusion process. But, again, only extremely high temperatures and pressures can achieve this reaction.
And one of the early problems associated with the use of lasers was that scientists couldn’t reach breakeven—the point where fusion produces as much energy as the lasers put in.
In any event, the resulting mix of nuclei and freed electrons, produced by the extreme compression and temperatures that tear the electrons from the isotopes, is called plasma.
When deuterium and tritium fuse they release helium, a neutron and large amounts of energy.
According to Scientific American, the laser that goes into the spherical chamber containing the pellet has 4.2 million joules of energy. Wikipedia defines a joule as “energy expended or work done in applying a force of one newton through a distance of one meter or in passing an electric current of one ampere through a resistance of one ohm for one second. A joule is the work required to move an electric charge of one coulomb through an electrical potential difference of one volt.”
Another way to say this is that a joule is the work required to produce one watt of power for one second. In any event, the laser that causes the pellet to implode requires a tremendous amount of energy.
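Since a joule is one watt sustained for one second, the 4.2 million joules cited above can be put in household terms; the 100 W bulb here is just an illustrative comparison:

```python
# A joule is one watt for one second, so the 4.2 million joules quoted
# above can be expressed in household terms (the 100 W bulb is
# illustrative, not from the article).
laser_energy_j = 4.2e6
bulb_watts = 100

seconds = laser_energy_j / bulb_watts
print(f"4.2 MJ would run a {bulb_watts} W bulb for {seconds / 3600:.1f} hours")
```

That works out to 42,000 seconds, or about 11.7 hours of light from a single laser shot’s worth of energy.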
But for fusion to become a practical source of energy, the energy emitted by the process must exceed that input (more than 4.2 million joules), and the process would have to be sustained twenty-four hours a day, every day, less the time the plant is down for maintenance and the like.
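The scale of that sustained-output requirement can be sketched with a back-of-envelope calculation. The 17.6 MeV energy yield per deuterium-tritium fusion is a standard physics figure not given in the text, and the 1 GW plant size is an illustrative assumption:

```python
# Back-of-envelope: how many D-T fusion reactions per second would a
# power plant need? Each D-T fusion releases about 17.6 MeV (a standard
# figure); the 1 GW plant output is an illustrative assumption, and
# conversion losses are ignored.
EV_TO_JOULES = 1.602e-19                       # joules per electron-volt

energy_per_fusion_j = 17.6e6 * EV_TO_JOULES    # ~2.8e-12 J per reaction
plant_power_w = 1.0e9                          # hypothetical 1 GW output

reactions_per_second = plant_power_w / energy_per_fusion_j
print(f"Energy per D-T fusion: {energy_per_fusion_j:.2e} J")
print(f"Reactions needed per second for 1 GW: {reactions_per_second:.2e}")
```

Each reaction releases only a few trillionths of a joule, so sustaining even a modest plant would take on the order of 3.5 × 10^20 fusions every second, around the clock.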
Another method of producing fusion, one that allows the process to go on longer than a second or so, involves magnetic fields that can, one hopes, control and hold the plasma, which wants to “leak out.”
It’s been hoped that the plasma could be held in place, even while increasing temperatures and pressures, by using extremely powerful magnets and microwaves.
In practice, the plasma has proved very difficult to hold in place. Yet it must be compressed in order to produce the pressure needed for fusion.
Researchers have found that the more pressure is applied, the more the plasma finds a way to “squirt” out the sides.
In effect, the more heat and pressure are applied, the more the plasma tends to escape; but without enough heat and compression, fusion can’t take place. That’s the paradox.
Thus, scientists have been on a quest to make even larger and more powerful magnets to hold the plasma together in order for fusion to occur.
In essence, the problem scientists have been trying to solve relates to the quirky behavior of plasma—it wants to squeeze out.
Still, superconducting magnets have been used to hold the plasma in place, and beams of microwaves are used to heat it to 150 million degrees Celsius.
This is the process that’s used at the ITER project in southern France.
The advantage of this process is that unlike the use of lasers, it can be sustained for longer periods of time.
Whatever process is ultimately used on a regular, long-term basis, one of the benefits of fusion is that it doesn’t create long-lived radioactive products, even if some otherwise stable materials, such as the thick steel used to contain the reaction, themselves become radioactive due to neutron bombardment.
But how is heat transferred from the energy created by fusion itself?
This question is important, as it relates to how fusion can be used to generate electricity and power.
According to designs for possible power plants, there would be what’s called a blanket surrounding the fusion core.
The blanket would be made of a thick metallic material so that it could withstand the many collisions with neutrons emitted by the fusion process. At the same time, those collisions would provide heat that ultimately would be used to generate electricity.
The plan envisions using a molten salt that would absorb and draw away heat from the reactor. The hot salt would then be used to power a steam turbine that would generate electricity.
One of the drawbacks is that while deuterium is abundant (found in ocean water), tritium is rare and would have to be obtained by the fusion reaction.
In order to harvest tritium from the fusion process, it’s envisioned that the blankets would contain lithium, which would react with the neutron emitted from the fusion reaction.
The neutron released by fusion would react with a lithium-6 atom to produce helium and tritium.
Reactions with lithium-7 would, in addition, yield another neutron, which in its turn could react with more lithium, and so on.
The tritium thereby produced would be re-injected into the plasma for further fusion reactions, and this breeding cycle would have to repeat many times to sustain the process.
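To see why breeding matters at scale, here is a rough estimate of daily tritium consumption for a fusion plant. The 1 GW plant size, the standard 17.6 MeV yield per D-T reaction, and the assumption of no conversion losses are illustrative assumptions, not figures from the article:

```python
# Rough mass balance: tritium consumed per day by a hypothetical 1 GW
# fusion plant (illustrative assumptions: 1 GW output, no conversion
# losses). Each D-T reaction consumes one tritium nucleus (~3 u).
EV_TO_JOULES = 1.602e-19
ATOMIC_MASS_KG = 1.66e-27                      # one atomic mass unit in kg

energy_per_fusion_j = 17.6e6 * EV_TO_JOULES    # ~2.8e-12 J per D-T fusion
reactions_per_day = (1.0e9 / energy_per_fusion_j) * 86400
tritium_kg_per_day = reactions_per_day * 3 * ATOMIC_MASS_KG

print(f"Tritium burned per day: {tritium_kg_per_day * 1000:.0f} g")
```

That comes to roughly 150 grams of tritium per day, all of which the blanket’s lithium reactions would have to replace, since there is no commercial supply.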
One of the obstacles, therefore, is creating a material for the blanket that would withstand many collisions with neutrons and the great amount of heat those collisions create.
Reportedly, ITER, the facility in France, won’t be testing blanket designs but will be concentrating on creating fusion by using microwaves and powerful magnets.
A difficulty with the laser process is that the pellets have to be precisely spherical so that they compress evenly from all sides; uneven compression would prevent fusion.
So far, this process is extremely expensive. Making a single pellet costs at least $1 million, and only a few can be made in a day. But a typical power plant would use up to 100,000 per day.
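Using the article’s own figures, the fuel-cost arithmetic is stark:

```python
# Fuel-cost arithmetic using the article's own figures: at least
# $1 million per pellet, and a plant consuming up to 100,000 per day.
cost_per_pellet = 1_000_000        # dollars (lower bound from the text)
pellets_per_day = 100_000

daily_fuel_cost = cost_per_pellet * pellets_per_day
print(f"Daily pellet cost at current prices: ${daily_fuel_cost:,}")
```

At current prices the fuel alone would run $100 billion per day, which is why pellet manufacturing costs would have to fall by many orders of magnitude before laser fusion could power a grid.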
These are some of the problems associated with fusion, but the potential is great and exciting.
If more money and time were spent researching and developing a practical way of generating electricity from fusion, much of life would be altered substantially.
Perhaps the investment is worth it. Given enough resources and time, it may well be possible to overcome the obstacles.
Even if fusion power were developed for everyday use, that isn’t to say solar and wind power would become unnecessary. Whatever the primary source of energy turns out to be, wind and solar will remain important elements.
According to a report in a recent issue of the Lancet Oncology, almost 17% of cancers are due to a viral or bacterial infection, like hepatitis C or bacterial infections in the stomach.
The study looked at 184 countries and found four types of infection that can be prevented either by using antibiotics or by using vaccines. It also found that the incidence of infection-related cancers was about three times higher in developing nations.
The infections associated with cancer were the hepatitis B and C viruses, human papillomavirus (which can result in cervical cancer) and Helicobacter pylori (a bacterial infection of the stomach).
Many of these cancers are preventable, according to the study’s findings, by using vaccines against the viral infections or by treating the bacterial ones, such as Helicobacter pylori.
In general, cancers are caused by cell mutations and unrestrained growth of cells due, in part, to environmental and/or genetic factors.
The Lancet Oncology study was funded by Fondation Innovations en Infectiologie (FINOVI) and the Bill & Melinda Gates Foundation (BMGF), according to the summary at the Lancet's website.
Some have described cancer as resulting from cells that don't expire and never reach maturity; in effect, they keep growing, due to mutations, genetically or environmentally caused.
Wikipedia describes some of the mechanisms:
"Cancer is fundamentally a disease of failure of regulation of tissue growth. In order for a normal cell to transform into a cancer cell, the genes which regulate cell growth and differentiation must be altered.
The affected genes are divided into two broad categories. Oncogenes are genes which promote cell growth and reproduction. Tumor suppressor genes are genes which inhibit cell division and survival. Malignant transformation can occur through the formation of novel oncogenes, the inappropriate over-expression of normal oncogenes, or by the under-expression or disabling of tumor suppressor genes. Typically, changes in many genes are required to transform a normal cell into a cancer cell.
Genetic changes can occur at different levels and by different mechanisms. The gain or loss of an entire chromosome can occur through errors in mitosis. More common are mutations, which are changes in the nucleotide sequence of genomic DNA."
In December last year Reuters reported on the progress of research into the Higgs boson, a particle theorized to give mass to other particles (mass and energy are said to be interchangeable).
It’s been theorized that it imparted those qualities when the Big Bang occurred almost 14 billion years ago.
At that time scientists at CERN, Europe’s main high-energy physics laboratory near Geneva, Switzerland, and home of the Large Hadron Collider, said that their research seemed to indicate the existence of the boson, although they cautioned that no firm conclusions could yet be drawn.
In sum, they reported that they had found signs of the Higgs boson, which is also described as an elementary “sub-atomic particle believed to have played a vital role in the creation of the universe after the Big Bang.”
The problem with trying to find the particle is that it’s short-lived and quickly decays into other particles; evidence for it emerges only after millions of collisions.
A few days ago Scientific American published an article about the ongoing studies at CERN. That article appeared originally in Nature magazine.
According to the more recent report, scientists have been tweaking computer code in order to better discern the presence of the particle, apparently because too many particles are “piling up”, as a result of trillions of collisions being conducted at the super-collider.
In order to try to recreate the conditions that are said to have existed at the time of the Big Bang, scientists have been conducting trillions of collisions of protons in the complex of the underground collider.
The Large Hadron Collider, or LHC, has “been squeezing trillions of protons into ever-smaller bunches, and smashing those bunches together tens of millions of times per second.”
In addition, researchers at CERN are endeavoring to increase the number and energy of collisions. It’s estimated that their studies will go on for at least another year.
Researchers say that finding conclusive evidence of the existence of the Higgs boson would go far toward explaining how the universe took shape after the Big Bang.
The Big Bang theory holds that since the initial explosion the universe has been expanding, with its constituent parts moving away from each other at an ever faster rate.
Knowing more about the mechanisms of the explosion and the aftermath is important, irrespective of one’s cosmology.
How did the universe begin? Is the way the question is worded more the difficulty than finding the answer to that question?
In any event, according to the physicist Lawrence M. Krauss, the answer seems clear.
The universe and billions of galaxies started in an infinitesimally dense and small point.
The universe has been expanding from that event for almost 14 billion years.
More than that, many physicists theorize that the universe is continuing to expand and that the acceleration of that expansion is increasing over time.
The ideas and theories many physicists and astronomers have about the origins and motion of the universe have been around for at least a century.
Thus, it’s hard to imagine that there was a time when physicists like Sir Isaac Newton, and even for a time Albert Einstein, posited a static, unchanging universe.
For some cosmologists a universe that came to be from a single event, an explosion of atomic and subatomic particles, suggests that some higher power or intelligence created the single event.
But in his article, “A Universe From Nothing: The Belgian Priest and The Puzzle of The Big Bang”, professor Krauss describes the evolution in thinking of Georges Lemaitre, the 20th century Belgian priest and physicist.
According to Krauss, Lemaitre posited a universe that started from an infinitesimal point (“Primeval Atom”).
At first Lemaitre thought that this beginning suggested the works of a higher power, but later he discarded that notion, saying that his theory of the Big Bang remains outside religious or metaphysical parameters.
Other physicists and theoreticians have posited not only a Big Bang but what they call a Big Crunch, which is when the universe stops expanding and contracts back on itself.
Whether the universe turns back on itself or expands indefinitely turns, in part, on the “cosmological constant,” the density of matter and energy, whether the universe is “open” or “closed” and so on. For that matter, just what is dark matter, and how does it affect whether the universe will contract in on itself so many billions of years from now?
If one accepts the Big Bang as a theory, it's also tempting to ascribe metaphysical interpretations, as in the universe being like an infinitely repeating CD. On that view, talk of a creator becomes unnecessary, if not undesirable.
But whatever theory one subscribes to, it’s not necessary to assume that the universe had to be “created” by a higher power.
If it’s true that the universe was created by a higher power, what or who created the higher power that set in motion the workings of the universe?
Doesn’t even the “first cause” have to have a first cause, ad infinitum?
Is it necessary to assume the existence of a creator to explain the workings and patterns of the universe?
The Royal Society issued a report on the world’s population and resources that makes for sobering reading.
In a kind of Malthusian vein, the study talked about the world’s resources in relation to a growing population that it projects will outstrip food production in parts of the world.
It’s expected that the global population will reach the 9 billion mark in 2050; as it is, the world’s population is about 7 billion. That’s the number it reached last October, according to a UN report issued at the time.
What are the Royal Society’s recommendations?
For one, it calls on the developed, rich world to reduce consumption.
For another, it advocates policies intended to reduce inequality by redistributing some of the rich world’s wealth.
As well, it recommends stabilizing the world’s population. A less expensive way to achieve that is by making effective contraceptives available to all women.
As was expected, the Royal Society’s report recommends reducing levels of human-created carbon dioxide emissions.
Another way of reducing fertility rates is to make education and economic opportunities available to more people.
Studies have shown that when economic conditions are better and education is universally available, women tend to have fewer children; more than that, women have a greater incentive to want smaller families as their economic and educational opportunities increase.
Thus, the report recommends improving conditions such that fertility rates, which are highest in a number of African and Middle Eastern countries, will decrease.
The authors of the study say that women in the countries with the highest fertility rates want to have smaller families.
A high fertility rate is around 4; a rate of about 2 prevails in many developed countries. A fertility rate of about 2 is the rate at which the population replaces itself; at lower rates, the absolute number of people declines over time.
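The replacement-rate arithmetic can be illustrated with a toy generational model (the numbers are hypothetical, and the model ignores mortality, migration and overlapping generations):

```python
# Toy model: population per generation under different fertility rates.
# Each woman has `fertility` children; roughly half of each generation
# are girls who go on to have children themselves. Mortality and
# migration are ignored, so this is only an illustration.
def project(population, fertility, generations):
    for _ in range(generations):
        women = population / 2
        population = women * fertility
    return population

start = 1_000_000
for rate in (4.0, 2.0, 1.4):
    result = project(start, rate, 3)
    print(f"fertility {rate}: after 3 generations -> {result:,.0f}")
```

At a fertility rate of 4 the population doubles every generation, at 2 it holds steady, and at 1.4 (the sub-replacement levels that worried the Economist) it shrinks by nearly a third per generation.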
Several years ago the Economist fretted about Italy, Japan and Germany, noting that the fertility rate in those countries was less than 2.
In general, the Royal Society report starts from the hard-to-deny premise that the world’s resources are finite.
Whether or not one supports all of its recommendations, that premise is difficult to dispute.
Even after Microsoft’s announcement last week that it would be selling patents and licenses to Facebook, Yahoo, which initiated a lawsuit against Facebook earlier this year alleging infringement of its patents, vowed to press ahead with its lawsuit against the social network behemoth.
Indeed, according to a report in Reuters, Yahoo officials say the recent announcement by Microsoft undergirds their position: companies that purchase patents are often working from a position of weakness and take such actions to strengthen it, and Yahoo views the announcement as a validation of its case against Facebook.
Yahoo has accused Facebook of infringing on its patents, many of which are related to online advertising.
Earlier last week APS Radio News reported that Facebook had announced on Monday that its first-quarter profit dropped about 12%.
Facebook is set to implement an IPO, or initial public offering, of its stock in May.
However, the founder of Facebook, Mark Zuckerberg, likely will retain effective control of over 50% of stock, according to observers.
Recently Forbes magazine published an article about how accounting firms like Ernst & Young interact with companies like Facebook and Zynga when performing audits and when they perform consulting services for the same company they’ve been auditing.
According to Forbes, accounting rules that apply to companies in the non-tech sector don't apply to tech startups like Facebook. Thus, a firm like Ernst & Young is allowed to do consulting work on technical and internal accounting issues for a startup. Then, when the startup is getting ready to go public with its IPO, the accounting firm, in this case E&Y, may put on its independent auditor hat.
For example, the revenue a company earns often turns on how deferred revenue is defined versus current revenue.
Given the interaction between Zynga, which provides games offered on Facebook’s website, and Facebook itself in relation to credits purchased for use in those games, how deferred and current revenue are booked plays into how much money a startup is actually making.
Put another way, what does a company’s annual report really mean in terms of the numbers and data presented? How should those numbers be interpreted?
In part, it depends on the accounting firm that does the consulting and the auditing and the way current and deferred revenue are defined.
This quote from Forbes is perhaps well worth considering, especially for those thinking about investing in tech startups:
“..in its fourth amended registration statement, filed with the SEC in October of 2011, Zynga noted that during the first half of last year it estimated the blended average paying player life for a game as 15 months, down from 19 months a year earlier. The shorter player life increased GAAP revenue for the six months by $27.3 million, turning a loss for the six months ended June 30, 2011 into a net profit of $18.1 million. Well-timed: This change came just before Zynga went public in mid-December at $10 a share.”