Gary McKinnon, a British computer analyst, has failed in his appeal against extradition to the US. McKinnon is accused of accessing 97 US military and NASA computers during his search for information about a possible US government conspiracy to cover up the existence of UFOs. According to the Glasgow-born 42-year-old, the computers he accessed were totally unprotected and surprisingly easy to hack. However, the US government says his actions were malicious and constitute the biggest breach of US government computers of all time. McKinnon’s activities gave him access to 16 NASA computers between 2001 and 2002.
For the record, it is my opinion that McKinnon is the victim of his own curiosity. He most certainly is not an organized terrorist wanting to bring down the US government. What’s more, the UK has tough laws that he can be prosecuted under, so why is he being extradited to a country where he has never set foot? Having followed this unfolding story for some years, I feel compelled to mention it on Astroengine.com. This man should not be extradited. The apparent ease with which this individual walked into NASA networks is astonishing; it’s not McKinnon who needs to be taken to court, it’s NASA’s Internet security experts who need to be taken to task… Continue reading “US Practices Retroactive Computer Protection: NASA Hacker to be Extradited”
International Space Station (ISS) software security has been brought into question after on-board systems were infected by a computer virus earlier this month. This is possibly the first time that a computer in space has played host to a malicious piece of software code, designed to seek out installed online gaming software and transmit sensitive account information to an attacker. Although the virus in question, known as the W32.Gammima.AG worm, is pretty harmless (after all, I don’t think the astronauts on board play many online games), the infection comes as a surprise. Why hasn’t the ISS got sufficient anti-virus software installed? How did this security breach pass unnoticed until now? The space station may have narrowly dodged the bullet on this one, for if the worm had been a little more virulent, there aren’t many network managers between here and low Earth orbit to find a quick solution to the problem… Continue reading “Computer Worm Infects International Space Station”
Early yesterday morning an Alliant Techsystems (ATK) ALV X-1 rocket launched from NASA’s launch facility at Wallops Island, VA. However, only 27 seconds and 11,000 feet into the flight, a launch anomaly prompted the range safety officer to hit the self-destruct button. According to sources, the ALV X-1 was a new type of launch vehicle costing $17 million (including NASA payload).
The ALV X-1 rocket is a sub-orbital design, more commonly known as a sounding rocket. Intended to carry instrumentation into the atmosphere rather than into orbit, the ALV X-1 would follow a parabolic flight path, delivering its payload to a predetermined altitude, where it would carry out experiments before parachuting back to Earth. At 5am on Friday morning, this obviously didn’t happen.
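To put the ballistic coast phase into numbers, here’s a minimal back-of-envelope sketch (in Python) of the apogee a sounding rocket reaches after engine cutoff. The burnout altitude and velocity below are illustrative assumptions, not ALV X-1 specifications, and drag is ignored entirely:

```python
# Back-of-envelope apogee estimate for a sounding rocket's ballistic
# coast phase. Burnout figures are assumed for illustration only.

G = 9.81  # m/s^2, gravitational acceleration (treated as constant)

def coast_apogee(burnout_altitude_m, burnout_velocity_ms):
    """Peak altitude after engine cutoff, ignoring drag:
    the rocket climbs a further v^2 / (2g) above the burnout point."""
    return burnout_altitude_m + burnout_velocity_ms**2 / (2 * G)

# Hypothetical burnout: 50 km altitude, 1.5 km/s vertical velocity
apogee = coast_apogee(50_000, 1_500)
print(f"Apogee: {apogee / 1000:.0f} km")  # ~165 km
```

Real trajectories are messier (drag, gravity varying with altitude, a non-vertical flight path), but this captures why a sounding rocket can loiter at altitude for several minutes of experiment time on the way up and back down.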
The Oort Cloud is a mysterious entity. Located on the outskirts of the Solar System, this hypothetical region is probably the source of the long-period comets that occasionally pass through the inner planets’ orbits. The strange thing about these comets is that their orbits are inclined at pretty much any angle to the ecliptic, which suggests their source isn’t a belt confined to the ecliptic plane (like the asteroid belt or Kuiper belt). Their proposed source is therefore a cloud, acting like a shell surrounding the Solar System.
OK, so we think the Oort Cloud is out there, and there is a lot of evidence supporting this, but why can’t we see Oort Cloud objects? After all, the Hubble Space Telescope routinely images deep space objects like stars, galaxies and clusters, so why can’t we use it to see embryonic comets within our own stellar neighbourhood? Continue reading “Why Can’t we see the Oort Cloud?”
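A quick bit of arithmetic hints at the answer. Using the standard asteroid-style magnitude relation, a body’s apparent brightness falls off with both its distance from the Sun (which lights it) and its distance from the observer; for an Oort Cloud object, both are tens of thousands of AU. The absolute magnitude H ≈ 14 below (roughly a 10 km, dark comet nucleus) is an assumed value for illustration:

```python
import math

def apparent_magnitude(H, d_sun_au, d_obs_au):
    """Asteroid-style magnitude relation, ignoring phase effects:
    m = H + 5 * log10(d_sun * d_obs), distances in AU."""
    return H + 5 * math.log10(d_sun_au * d_obs_au)

H = 14.0  # absolute magnitude of a ~10 km, low-albedo nucleus (assumed)

print(f"At 1 AU from Sun and Earth: m = {apparent_magnitude(H, 1, 1):.1f}")
print(f"At 50,000 AU:               m = {apparent_magnitude(H, 50_000, 50_000):.1f}")
# Hubble's faint limit is roughly m ~ 31, so an object at m ~ 61 is
# hopelessly beyond detection -- about 30 magnitudes (a factor of
# ~10^12 in brightness) too faint.
```

Every 5 magnitudes is a factor of 100 in brightness, which is why the same nucleus that would be an easy target at 1 AU simply vanishes at Oort Cloud distances.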
While writing the Universe Today article Bad News: Interstellar Travel May Remain in Science Fiction yesterday, I couldn’t help but feel depressed. In all my years of science fiction viewing, I have never thought that travelling to another star would be impossible. Although I knew it would be hard, and something we won’t be able to consider for a century or so, I always assumed it could be possible. Well, at a recent meeting of rocket scientists at the Joint Propulsion Conference in Hartford, Connecticut, the conclusion was that even the most advanced forms of propulsion would require gargantuan quantities of fuel to carry a starship over the few light years to the nearest star. Suddenly I realised I had been looking at the question of interstellar travel in the wrong light; it’s not that it would take a stupid number of generations to get from A to B, it’s that we would require 100 times the total energy output of Earth to make it there. Where’s Captain Kirk when you need him… Continue reading “Travelling to Another Star? Unfortunately Starship Fuel Economy Sucks”
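To get a feel for the numbers, consider just the kinetic energy of a cruising starship, ignoring the rocket-equation penalty of hauling your own fuel (which is what drives the conference’s far larger estimates). The ship mass, speed and world-energy figure here are illustrative assumptions:

```python
# Back-of-envelope: kinetic energy alone for a modest starship.
# All input figures are rough assumptions for illustration.

C = 3.0e8          # speed of light, m/s
SHIP_MASS = 1.0e6  # kg (1,000 tonnes -- roughly ISS-class, assumed)
SPEED = 0.1 * C    # cruise at 10% of light speed (assumed)

WORLD_ANNUAL_ENERGY = 5.0e20  # J, approximate global energy use per year

# Non-relativistic KE is a fair approximation at 0.1c (error ~1%)
kinetic_energy = 0.5 * SHIP_MASS * SPEED**2

print(f"Kinetic energy: {kinetic_energy:.1e} J")
print(f"= {kinetic_energy / WORLD_ANNUAL_ENERGY:.1f} years of world energy use")
```

So even before you carry a single kilogram of propellant, getting a modest ship up to a tenth of light speed costs on the order of a year of humanity’s entire energy output, and the fuel needed to deliver that energy (and then brake at the other end) multiplies the bill enormously.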
The Mayan long-count calendar ends on December 21st 2012. For many, this is a very important date, religiously and spiritually. However, a huge number of doomsday scenarios are being pinned on this day too. Why? Well, your guess is as good as mine. This is a very strange phenomenon. We’ve heard “end of the world” theories for millennia; from Nostradamus and the Bible to the Y2K bug, but as yet (as far as I can tell) the Earth has not been destroyed. Many historic prophecies have been made deliberately vague, making it more likely that some future event will appear to match the prediction. That’s fine; I have no problem with a mystical historic figure telling us the world is going to fry on an undetermined date at the hands of an undetermined harbinger of doom. But I have a huge problem with modern-day authors publishing scientific inaccuracies for personal gain.
Preparations for the European ExoMars mission appear to be in full swing for a 2013 launch to the Red Planet. This will be a huge mission for ESA, as the agency has yet to control a robot on the surface of another planet. Yes, we Europeans had control of the Huygens probe that drifted through the atmosphere of Titan (and had a few minutes to feel what it was like to sit on another world before Huygens slipped into robot heaven), but it’s been NASA that has made all the strides in robotic roving technology. Although Russia gave the rover thing a blast back in 1971, the roads have been clear for the 1997 Mars Pathfinder Sojourner rover and NASA’s current Mars Exploration Rovers. Spirit and Opportunity are still exploring the planet (despite limping wheels and stiff robotic arms), several years after their warranty expired. But the Exploration Rovers won’t be the most hi-tech robotic buggies to rove the Martian regolith for much longer.
Some great news from the Durban University of Technology in South Africa: its newly built Indlebe Radio Telescope detected its first signal late last month. “On the evening of 28th July 2008, at 21h14 local time the Indlebe Radio Telescope, situated on the Steve Biko campus of the Durban University of Technology (DUT), successfully detected its first radio source from beyond the solar system. A strong source was detected from Sagittarius A, the centre of the Milky Way Galaxy, approximately 30 thousand light years away,” says the statement by Stuart MacPherson. This will be an invaluable resource for students and research projects; a great achievement.
Although this should be the focus of attention, it looks like social bookmarking may have struck again. The DUT announcement was picked up by Digg and the Internet population drew their own conclusions. Interestingly, the Russian mainstream media was listening and interpreted the Internet buzz as proof that an alien radio signal had been detected in the centre of our galaxy… Continue reading “No, An Alien Radio Signal Has Not Been Detected”
In 2006, the International Astronomical Union (IAU) decided to re-classify what constitutes a planet. Firstly, the candidate must orbit the Sun. Secondly, it must be spherical (none of those asteroid-potato shapes, please). Thirdly, it must have cleared its orbital path of junk. As soon as these three planetary characteristics were specified by the IAU (which is responsible for planet-naming and astronomical nomenclature), Pluto found itself orbiting without a planetary licence and was promptly demoted to a “dwarf planet.” The decision caused two years of arguing and public outcry, until the IAU dubbed Pluto-like bodies “Plutoids.” The move was seen as an affront to the Solar System’s erstwhile ninth planet, which had over 70 years of proud history (after all, it was once thought to be the mysterious Planet X). So next week, the world’s leading astronomers and planetary scientists are gathering in Maryland for a conference addressing the Pluto issue, voicing their frustration at the IAU’s controversial decision and calling the “Plutoid” classification the Solar System’s “celestial underclass”… Continue reading “Poll: Should Pluto be Re-Instated as a Planet?”
Billions of Euros have been ploughed into the construction of the largest experiment in the history of mankind. The Large Hadron Collider (officially due to be “switched on” September 10th 2008) will eventually create proton-proton collision energies near the 14 TeV mark by the end of this decade. This is all highly impressive; already the applications of the LHC appear to be endless, probing smaller and smaller scales with bigger and bigger energies. But how did the LHC secure all that funding? After all, the most expensive piece of lab equipment ever built must have a purpose. Although its aims are varied and far-reaching, the LHC has one key task: discover the Higgs boson, the world’s most sought-after particle. If it is discovered, key theories in particle physics, including the Standard Model’s mechanism for giving particles mass, will be vindicated. If it isn’t found by the LHC, perhaps our theories are wrong and our view of the Universe needs to be revolutionized… or the LHC needs to be more powerful.
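For a sense of scale, the 14 TeV figure is the centre-of-mass energy of two 7 TeV proton beams colliding head-on, and a couple of lines of arithmetic show just how relativistic those protons are:

```python
import math

# What "7 TeV per beam" means for an individual proton.
PROTON_REST_ENERGY_GEV = 0.938   # proton rest energy, m_p * c^2
BEAM_ENERGY_GEV = 7_000.0        # LHC design energy per beam

# Two beams colliding head-on give twice the beam energy.
collision_energy_tev = 2 * BEAM_ENERGY_GEV / 1000

# Lorentz factor: total energy divided by rest energy.
gamma = BEAM_ENERGY_GEV / PROTON_REST_ENERGY_GEV
beta = math.sqrt(1 - 1 / gamma**2)  # speed as a fraction of c

print(f"Collision energy: {collision_energy_tev:.0f} TeV")
print(f"Lorentz factor:   {gamma:,.0f}")
print(f"Speed:            {beta:.9f} c")
```

Each design-energy proton carries roughly 7,500 times its own rest energy and travels within a few metres per second of the speed of light, which is why the machine’s superconducting magnets have to work so hard to keep the beams on their 27 km circular track.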