
17 January 2020

The History Of UV

Most people are not aware of the long history of UV in water disinfection, or that UV is rapidly gaining popularity as an effective non-chemical alternative for water disinfection.

Ultraviolet disinfection of water has a long and well-proven history. UV light has long been accepted as an effective germicidal treatment, and has been installed in many major public drinking water and wastewater treatment plants worldwide.




Although it took a long time for the technology to become widely adopted, UV itself is nothing new. In 1877, the germicidal properties of sunlight were discovered, and it was only a matter of time before people tried to apply this knowledge for practical use. In 1903, Niels Finsen received a Nobel Prize for his use of ultraviolet light to combat tuberculosis (although not in water), and in 1910, the first drinking water disinfection system opened in Marseilles, France.

From that time on, the technology changed very little until the 1930s, when the first tubular lamps were developed, allowing for easier applications and different configurations for use. In the 1950s, the first truly significant research of UV disinfection began. By the 1960s, UV disinfection was becoming more widely used in commercial applications and was creeping into the residential market.

Today, ultraviolet disinfection is widely accepted as an effective treatment for the removal of microbiological contaminants from water.

Even highly chlorine-resistant microbes such as Giardia and Cryptosporidium can be effectively disinfected from water with UV. NSF-certified UV systems are becoming an increasingly popular alternative to chemical treatment for many applications.

viqua.com



Ultraviolet (UV) Radiation
What is UV radiation?


Ultraviolet (UV) radiation is a form of electromagnetic radiation that comes from the sun and man-made sources like tanning beds and welding torches.

Radiation is the emission (sending out) of energy from any source. There are many types of radiation, ranging from very high-energy (high-frequency) radiation – like x-rays and gamma rays – to very low-energy (low-frequency) radiation – like radio waves. UV rays are in the middle of this spectrum. They have more energy than visible light, but not as much as x-rays.

There are also different types of UV rays, based on how much energy they have. Higher-energy UV rays are a form of ionizing radiation. This means they have enough energy to remove an electron from (ionize) an atom or molecule. Ionizing radiation can damage the DNA (genes) in cells, which in turn may lead to cancer. But even the highest-energy UV rays don’t have enough energy to penetrate deeply into the body, so their main effect is on the skin.


UV radiation is divided into 3 main groups:
1. UVA rays have the least energy among UV rays. These rays can cause skin cells to age and can cause some indirect damage to cells’ DNA. UVA rays are mainly linked to long-term skin damage such as wrinkles, but they are also thought to play a role in some skin cancers.
2. UVB rays have slightly more energy than UVA rays. They can damage the DNA in skin cells directly, and are the main rays that cause sunburns. They are also thought to cause most skin cancers.
3. UVC rays have more energy than the other types of UV rays. Fortunately, because of this, they react with ozone high in our atmosphere and don’t reach the ground, so they are not normally a risk factor for skin cancer. But UVC rays can also come from some man-made sources, such as arc welding torches, mercury lamps, and UV sanitizing bulbs used to kill bacteria and other germs (such as in water, air, food, or on surfaces).
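The energy ordering of the three bands follows directly from their wavelengths, since photon energy is E = hc/λ. A minimal sketch of the arithmetic, assuming the conventional band boundaries (UVA 315–400 nm, UVB 280–315 nm, UVC 100–280 nm), which are not stated in the article:

```python
# Photon energy E = h*c / wavelength, expressed in electron-volts.
# Band boundaries are the conventional definitions (an assumption, not from
# the article): UVA 315-400 nm, UVB 280-315 nm, UVC 100-280 nm.
HC_EV_NM = 1239.84  # h*c in eV*nm

def photon_energy_ev(wavelength_nm):
    """Energy in eV of a photon with the given wavelength in nm."""
    return HC_EV_NM / wavelength_nm

bands = {"UVA": (315, 400), "UVB": (280, 315), "UVC": (100, 280)}
for name, (short, long) in bands.items():
    print(f"{name}: {photon_energy_ev(long):.2f}-{photon_energy_ev(short):.2f} eV")

# Germicidal UVC lamps emit near 254 nm, i.e. about 4.9 eV per photon:
print(f"254 nm: {photon_energy_ev(254):.2f} eV")
```

Shorter wavelength means higher photon energy, which is why UVC is the most energetic band and the one used in sanitizing bulbs.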

15 January 2020

How a Nuclear Accident Destroyed a Soviet Submarine and Damaged the Environment

Many Cold War accidents were kept secret until later declassified. That was the case with the Soviet submarine K-431: just how bad the entire incident was wasn't revealed until later.

In 1985, a Soviet submarine undergoing a delicate refueling procedure experienced a freak accident that killed ten naval personnel. The fuel involved was not diesel, but nuclear, and the resulting environmental disaster contaminated the area with dangerous, lasting radiation. The incident, which remained secret until after the demise of the USSR itself, was one of many nuclear accidents the Soviet Navy experienced during the Cold War.


Image: Wikipedia

The Soviet Union’s nuclear war planners had a difficult time targeting the United States. While the United States virtually encircled the enormous socialist country with nuclear missiles in countries such as Turkey and Japan, the Western Hemisphere offered no refuge for Soviet deployments in-kind.

One solution was the early development of nuclear cruise missile submarines. These submarines, known as the Echo I and Echo II classes, were equipped with six and eight P-5 “Pyatyorka” nuclear land attack cruise missiles, respectively. Nicknamed “Shaddock” by NATO, the P-5 was a subsonic missile with a range of 310 miles and a 200- or 350-kiloton nuclear warhead. The P-5 had a circular error probable of 1.86 miles, meaning half of the missiles aimed at a target would land within that distance, while the other half would land farther away.
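The "half within, half beyond" definition of circular error probable can be illustrated with a quick Monte Carlo sketch. The Rayleigh miss-distance model below (independent Gaussian errors in each axis) is an illustrative assumption, not a claim about the P-5's actual error distribution:

```python
import math
import random

# Circular error probable (CEP): the radius within which half of all shots land.
# With independent Gaussian errors in x and y (illustrative assumption), the
# miss distance is Rayleigh-distributed and CEP = sigma * sqrt(2 * ln 2).
random.seed(0)
CEP_MILES = 1.86
sigma = CEP_MILES / math.sqrt(2 * math.log(2))

n = 100_000
hits_inside = 0
for _ in range(n):
    x = random.gauss(0, sigma)
    y = random.gauss(0, sigma)
    if math.hypot(x, y) <= CEP_MILES:
        hits_inside += 1

print(f"fraction within CEP: {hits_inside / n:.3f}")  # ~0.5
```

Roughly half of the simulated impacts fall inside the 1.86-mile circle, matching the definition in the text.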

The missiles were stored in large horizontal silos along the deck of the submarine. In order to launch a P-5 missile, the submarine would surface, deploy and activate a tracking radar, then feed guidance information to the missile while it flew at high altitude. The system was imperfect—the command link was vulnerable to jamming, and the submarine needed to remain on the surface, helpless against patrol aircraft and ships, until the missile reached the target. Eventually the P-5 was withdrawn and replaced with the P-6, a similar weapon but one with its own radar seeker for attacking U.S. aircraft carriers.


The introduction of the P-6 gave the Echo II a new lease on life. By 1985, the submarine K-431 was already twenty years old but still technically useful. Like all Echo IIs, K-431 was powered by two pressurized water reactors that drove steam turbines to a total of sixty thousand shipboard horsepower. As old as it was, K-431’s nuclear fuel supply needed replenishing, and by early August the process had started at the Soviet Navy’s facilities at Chazhma Bay.

On August 10, the submarine was in the process of being refueled. Reportedly, the reactor lid—complete with new nuclear fuel rods—was lifted as part of the process. A beam was placed over the lid to prevent it from being lifted any higher, but incompetent handling apparently resulted in the rods being lifted too high into the air. (One account has a wave generated by a passing motor torpedo boat rocking the submarine in its berth, also raising the rods too high.) This resulted in the starboard reactor going critical, followed by a chain reaction and explosion.

The explosion blew out the reactor’s twelve-ton lid—and fuel rods—and ruptured the pressure hull. The reactor core was destroyed, and eight officers and two enlisted men standing nearby were killed instantly. The blast threw debris into the air, and a plume of fallout 650 meters wide by 3.5 kilometers long traveled downwind on the Dunay Peninsula. More debris and the isotope cobalt-60 were thrown overboard and onto the nearby docks.

According to Nuclear Risks, the accident scene was heavily contaminated with radioactivity. Gamma ray radiation was not particularly bad; at an exposure rate of five millisieverts per hour, it was the equivalent of getting a chest CT scan every hour. However, the explosion also released 259 petabecquerels of radioactive particles, including twenty-nine gigabecquerels of iodine-131, a known cause of cancer. This boded very badly for the emergency cleanup crews, especially firefighters who needed to get close to the explosion site, and for the nearby village of Shkotovo-22. Forty-nine members of the cleanup crew displayed symptoms of radiation sickness, ten of them displaying acute symptoms.
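To put the quoted exposure rate in perspective, a short back-of-the-envelope calculation. The 20 mSv/year occupational limit used below is the standard ICRP reference value, an assumption brought in for comparison rather than a figure from the article:

```python
# Cumulative gamma dose at the quoted exposure rate of 5 mSv/hour.
# The ICRP occupational limit of 20 mSv/year is a standard reference value
# (an assumption, not from the article).
RATE_MSV_PER_HOUR = 5.0
ICRP_ANNUAL_LIMIT_MSV = 20.0

hours_to_annual_limit = ICRP_ANNUAL_LIMIT_MSV / RATE_MSV_PER_HOUR
print(f"hours on site to reach a year's occupational limit: "
      f"{hours_to_annual_limit:.0f}")
```

At that rate, a cleanup worker near the reactor would exceed a full year's occupational gamma allowance in a single four-hour shift, before accounting for the inhaled radioactive particles that made the situation far worse.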

One bright spot in the incident was that it had involved the new fuel rods and not the old ones, and thus large amounts of particularly dangerous isotopes generated during nuclear plant operation, such as strontium-90 and cesium-137, were not present. While the Chazhma Bay region appears contaminated to this day with radiation, it is unknown how much of it is the result of the K-431 incident and how much the result of the many nuclear-powered submarines that were junked and forgotten in the area.


Nuclear Accident

14 January 2020

Did Astronomers Just Discover Black Holes from the Big Bang?

Gravitational waves attributed to the collision of two neutron stars could have been produced by something much stranger

By Nola Taylor Redd on January 13, 2020

In the nearly five years since their first direct detection, gravitational waves have become one of the hottest topics in astronomy. With facilities such as the Laser Interferometer Gravitational-Wave Observatory (LIGO), researchers have mostly used these ripples in spacetime to study the inner workings of merging black holes, but LIGO has also detected gravitational waves from other sorts of celestial crashes, such as the collisions of ultradense stellar remnants called neutron stars. Sometimes, however, LIGO serves up gravitational waves that leave astronomers scratching their heads—as was the case for GW190425, an event detected last April that was recently attributed to a neutron star merger.

Snapshot from the central region of a numerical simulation of two merging neutron stars. It shows the stars stretched out by tidal forces just before their collision. Credit: CoRe/Jena FSU


The trouble is that LIGO’s data suggest this neutron star pair was substantially overweight—collectively, some 3.4 times the mass of the sun, which is half a solar mass heavier than the most massive neutron star binaries ever seen. “It is the heaviest known by a pretty wide margin,” says Chad Hanna, an astrophysicist at Pennsylvania State University who hunts gravitational waves.

That extra weight has some theorists suspecting that GW190425 did not arise from colliding neutron stars at all but rather something much more exotic: A merger of two primordial black holes (PBHs), never before seen objects that are considered a dark horse candidate for dark matter—the invisible, unidentified something that makes up most of the matter in the universe. Theorized to have formed from density fluctuations in the very early universe, these ancient black holes could still exist today and could explain the mass discrepancy identified in the recent LIGO observations.

Almost a half-century ago, cosmologist Stephen Hawking proposed that PBHs could have sprung fully formed from regions of the infant universe that were especially dense with matter. Since then, the idea’s popularity among astrophysicists and cosmologists has wildly waxed and waned. Today, in the absence of direct evidence for their existence, PBHs are seen by many researchers as a hypothesis of last resort, only to be considered when no other scenario readily fits observations. The possibility that PBHs are real and widespread throughout the universe cannot yet be dismissed, however—especially as searches for other dark matter candidates come up empty.

PBHs make an appealing candidate for dark matter for several reasons, but the most important one is that, being black holes, they are quite dark yet still pack a hefty gravitational pull. Despite that fact, Hanna says that if PBHs were abundant enough to account for all of the universe’s dark matter, astronomical surveys that hunted for them should not have come up empty. Consequently, he adds, PBHs can only make up a small fraction of dark matter—if they exist at all.

Not everyone agrees. “Primordial black holes can comprise the whole of dark matter,” says Juan García-Bellido, a theoretical cosmologist at the Autonomous University of Madrid. The trick, he adds, is for the ancient objects to exhibit an array of masses rather than a single definitive size. If PBHs run the gamut from a thousand times less massive than the sun to a billion times larger, they could make up all of the universe’s dark matter. “All published constraints that claim to rule out primordial black holes as dark matter assume they exist in a monochromatic, or single-mass, spectrum and are uniformly distributed in space,” García-Bellido says. For such large mass ranges to manifest, the PBHs would have to cluster in compact groups in which they could occasionally collide, merge and grow larger.

Because PBHs would have been created shortly after the big bang, they initially could have easily connected with one another. The early universe was a much smaller place than it is today after dramatically expanding for nearly 14 billion years, making it easier for the objects to find other PBHs and pair up with them. As the universe continued to expand, and the first stars and galaxies emerged, however, those connections would have become increasingly rare. So while it is possible that LIGO has observed merging PBHs, it is unlikely, according to astronomer Katerina Chatziioannou, a LIGO team member at the Flatiron Institute in New York City and co-author of a study set to appear in the Astrophysical Journal Letters that pegs GW190425 as the product of colliding neutron stars.

Last April, alerted to LIGO’s detection of GW190425, telescopes around the world hunted for a corresponding electromagnetic signal that would typically be expected from the explosive collision of two neutron stars. But the skies remained dark, as they would if a pair of primordial black holes had slammed together. 

https://www.scientificamerican.com/article


Einstein's theory of General Relativity is, strictly speaking, wrong because it cannot handle quantum fluctuations of space-time. It must be replaced by a theory of quantum gravity. In this week's video I talk about the best options to test quantum gravity. (Sabine Hossenfelder, physicist, Twitter, January 17, 2020)

7 January 2020

About Proving Method of General Relativity

Evidence of Einstein’s Proving Method

The Evidence, 2016

The proving method of a theory, as suggested by Einstein, was recorded in the book ‘The Universe and Dr. Einstein’, written by Lincoln Barnett and first published in London in June 1949. The preface of this book was written by Albert Einstein himself.


“From these purely theoretical considerations Einstein concluded that light, like any material object, travels in a curve when passing through the gravitational field of a massive body. He suggested that his theory could be put to test by observing the path of starlight in the gravitational field of the Sun. Since the stars are invisible by day, there is only one occasion when Sun and stars can be seen together in the sky, and that is during an eclipse.

Einstein proposed therefore, that photographs be taken of the stars immediately bordering the darkened face of the sun during an eclipse and compared with photographs of those same stars made at another time. According to his theory, the light from the stars surrounding the Sun should be bent inward, toward the Sun, in traversing the Sun’s gravitational field; hence the images of these stars should appear to observers on earth to be shifted outward from their usual positions in the sky.

Einstein calculated the degree of deflection that should be observed and predicted that for the stars closest to the Sun the deviation would be about 1.75 seconds of an arc. Since he staked his whole General Theory of Relativity on this test, men of science throughout the world anxiously awaited the findings of expeditions which journeyed to equatorial regions to photograph the eclipse of May 29, 1919. When their pictures were developed and examined, the deflection of the starlight in the gravitational field of the sun was found to average 1.64 seconds—a figure as close to perfect agreement with Einstein’s prediction as the accuracy of instruments allowed.“
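The 1.75-arcsecond figure quoted in the passage can be reproduced from the general-relativistic deflection formula for a light ray grazing the solar limb, δ = 4GM/(c²R), using standard values for the constants:

```python
import math

# Light deflection at the solar limb in general relativity: delta = 4GM/(c^2 R).
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
C = 2.998e8       # speed of light, m/s
R_SUN = 6.957e8   # solar radius, m

delta_rad = 4 * G * M_SUN / (C**2 * R_SUN)
delta_arcsec = math.degrees(delta_rad) * 3600
print(f"{delta_arcsec:.2f} arcseconds")  # ~1.75
```

This is twice the value a purely Newtonian treatment of light as falling particles would give, which is why the 1919 measurement was taken as a test between the two theories.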


The proving method of the theory of general relativity, as requested by its founder, Albert Einstein, is not scientific and is deeply wrong:

1. The deflection of light is the angular difference between the true position and the apparent position of a star, or the difference in altitude. In astronomy, the true and apparent positions of stars are three-dimensional.

All photographs taken of the stars are two-dimensional.

In this case Einstein ignored ‘Space and Time’, i.e. the celestial sphere (the celestial coordinate system), and ignored light refraction, which are fundamental concepts in astronomy.

2. All photographs taken of a solar eclipse (the Sun and stars) are photographs of the apparent positions of the Sun and stars. These photos cannot be used to calculate the deflection of light; no one can determine the correct angle of the deflection of light from them.

In this case Einstein ignored the experimental techniques.

3. In astronomy, all calculations to determine the true position and the apparent position of a given star in the sky are valid only at a specific time and at the specific place at which the observation is performed.

Comparing the photographs taken during an eclipse with photographs of those same stars made at another time is not scientific.

Conclusions: Einstein’s proving method for his hypothesis of the deflection of light by the Sun is not scientific and is deeply wrong. General relativity has been wrong since the beginning.




Backreaction.blogspot.com



 