Monday, September 28, 2009

RA Free Time

This is a hijacking of the blog: it is no longer intended for its original purpose.

This post will be used as a central place for the Enginuity and CNAS RAs to post their free evening times for a meeting.

A central page avoids the hassle of multiple emails to everybody, each reflecting a different snapshot of the schedule.

Note that the site is public.

I am free the following nights (this week):
Tue - 9pm+
Wed - 4pm+
Thur - 10pm+
Fri - 6pm+
Sat - 12pm+

Cody

Thursday, June 4, 2009

Seminar: Heterogeneous GPU Computing


The topic of the seminar was the practical application of the GPU, a microprocessor optimized for common graphical workloads, to parallel computing tasks. The GPU can be thought of as the CPU of any modern video card that a consumer can go out and buy at the store.

A large part of the talk explained the differences between a CPU and a GPU. In both cases, the silicon architecture is crafted to process what are known as "atomic" statements. As the name implies, atomic statements are instructions that the processor can execute natively, without breaking them down into more basic instructions. Atomic statements are therefore the "basic" instructions of the chip.

A general-purpose CPU like the Intel Core i7 is crafted so that it can, with a little extra work, compute just about any problem that a piece of software or hardware requires. A GPU, on the other hand, like the NVidia GTX 295, is crafted to work quickly on mathematical problems like matrix math. This is because the graphics displayed on a monitor, whether the Windows desktop or a rendered scene from a video game, are the result of numerous mathematical calculations on geometric objects. So instead of the problem being broken down into basic parts on the Core i7 (which adds considerable computational overhead), it can be processed as-is on the GTX 295.
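To make the difference concrete, here is a minimal sketch (my own, not from the seminar) of timing the same matrix multiplication on the CPU and on the GPU. It uses the PyTorch library and assumes a CUDA-capable NVidia card is present:

```python
# Minimal sketch: the same matrix multiplication on the CPU and on the GPU.
# Uses PyTorch (my choice, not mentioned in the seminar) and assumes a
# CUDA-capable card.
import time
import torch

n = 4096
a = torch.rand(n, n)
b = torch.rand(n, n)

start = time.time()
torch.matmul(a, b)                      # broken into scalar ops on the CPU
print(f"CPU: {time.time() - start:.3f}s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()   # copy the matrices to video memory
    start = time.time()
    torch.matmul(a_gpu, b_gpu)          # processed "as is" by the GPU cores
    torch.cuda.synchronize()            # wait for the asynchronous GPU call
    print(f"GPU: {time.time() - start:.3f}s")
```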

For parallel computing tasks, such a math-oriented environment is ideal. Many parallel computing problems require the hardware to spend enormous amounts of time crunching floating-point numbers. Adding to the computational power is the number of cores on a modern GPU; the current NVidia Tesla GPU contains 240 cores dedicated to processing. Naturally, the Tesla GPU outperforms the Core i7 almost threefold on computationally intensive tasks.

Of course, the disadvantage of the GPU is its utterly abysmal performance on general computation; the CPU is designed with the general case in mind, after all. One point the seminar didn't touch on is a common misconception about GPU performance relative to the CPU: the amazing threefold advantage can only be achieved on problems termed "embarrassingly" parallel, that is, problems that are (laughably) easy to break down into independent parallel components.

I guess number-crunching fits that bill quite well.
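For illustration, here is a minimal sketch (mine, not the speaker's) of what an embarrassingly parallel job looks like: every element can be processed independently, so the work splits cleanly across cores or GPU units.

```python
# Minimal sketch of an "embarrassingly" parallel problem: each number can be
# crunched independently of all the others.
from multiprocessing import Pool

def crunch(x: float) -> float:
    return x ** 0.5 + x ** 2          # any independent floating-point work

if __name__ == "__main__":
    with Pool() as pool:              # one worker per CPU core
        results = pool.map(crunch, range(1_000_000))
    print(sum(results))
```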

Monday, June 1, 2009

Longer Lasting Digital Memory


In an era where a sizable portion of the world's literature is stored in some digital form, the shelf life of digital media leaves a lot to be desired. The irony is that while efforts to "archive" the printed works of centuries past were intended to preserve them, the printed medium will more than likely outlast the digital collection it has been copied into. This is due to theoretical limitations in using semiconductors for such tasks.

Researchers have discovered an alternative to the traditional semiconductor approach using modern techniques from the field of nanotechnology. Digital information is usually stored in a medium as a machine-readable set of 1's and 0's. By placing an iron nanoparticle inside a carbon nanotube, where it can rest in one of two positions, one can induce the iron to move between those positions with an electric current. The two positions effectively represent the machine-readable 1's and 0's the system requires.

Greater storage space is achieved by packing the components of a digital medium into dense clusters: the greater the density of the medium in a given physical space, the greater the storage offered. Unfortunately, semiconductor density also has an inverse relationship with shelf life: the denser the medium, the shorter its life. The nanotube system is believed to be stable in this regard, as nanotubes can be packed as densely as needed while yielding the same shelf life: over a billion years.

Nanoscale Reversible Mass Transport for Archival Memory [Nano Letters, ACS Publishing]

Sunday, May 31, 2009

A planet like ours?

Geoff Marcy, one of the world's top astronomers and a leading planet hunter, recently announced his discovery of a planet which might be similar to Earth. The planet in question is approximately 41 light-years away and 55 times bigger than Earth. So far, this is the closest resemblance to Earth that has been discovered. I find it funny that Marcy has already received two calls from the Vatican asking him for details of the newly discovered planet. It has long been debated why the planet Earth was placed where it is, so it is no wonder the Vatican is interested.

Although a planet like Earth has been discovered, it doesn't seem like we will be seeing it any time soon. With our current technology, it would take hundreds of years to travel 41 light-years. It's pretty amazing that we're able to observe something that far away in the first place. Despite this, three programs are being put in place to help achieve this goal one day. It's amazing how far we've come in the exploration of space. It's also interesting that Marcy has fully planned out what to do if we humans ever interact with extraterrestrial beings. It really is something special to live through the making of history.

Sunday, May 24, 2009

Cell Phone Viruses


Cell phones are so popular now that over 80 percent of Americans own one. Yet it was unclear for some time why cell phone users had yet to be attacked by a major virus outbreak. Albert-László Barabási of Northeastern University set out to find out why. Barabási and his team of researchers, working from data collected on six million cell phone users, found that what has protected cell phone users so far is "…fragmented market share." (ScienceDaily).

However, Barabási explains that cell phones will not be safe from virus outbreaks forever. He calculates that once a single market share is large enough, cell phones will likely be vulnerable to attack, and that may happen soon, since smartphone ownership is increasing 150 percent every year. The infection of one phone can spread to others that come in contact with it quite quickly. Marta Gonzales, one of the researchers involved, explained that "…a mobile phone virus can spread by two mechanisms: a Bluetooth virus can infect all Bluetooth-activated phones in a 10-30 meter radius, while Multimedia Messaging System (MMS) virus, like many computer viruses, spreads using the address book of the device. Not surprisingly, hybrid viruses, which can infect via both routes, pose the most significant danger." (ScienceDaily).

The researchers also found that the spread of viruses through Bluetooth could eventually reach all mobile phones, but the spread is slow, so they feel there will be enough time to develop anti-virus software for phones. Viruses can spread much more quickly through MMS, but only a small number of phones currently have the technology.
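As a back-of-the-envelope illustration of the two mechanisms Gonzales describes, here is a toy Python simulation. It is my own construction, not the researchers' model; the positions, address-book sizes, and step count are all made up.

```python
# Toy sketch (not the researchers' model) of the two spreading mechanisms:
# a Bluetooth virus reaches phones within radio range; an MMS virus jumps
# straight to every number in an infected phone's address book.
import random

random.seed(1)
phones = range(100)
positions = {p: (random.uniform(0, 500), random.uniform(0, 500)) for p in phones}
contacts = {p: random.sample(phones, 5) for p in phones}   # address books
infected = {0}

def in_bluetooth_range(a, b, radius=30.0):                 # 10-30 m radius
    (ax, ay), (bx, by) = positions[a], positions[b]
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 <= radius

for step in range(10):
    newly = set()
    for p in infected:
        newly |= {q for q in phones if in_bluetooth_range(p, q)}  # Bluetooth
        newly |= set(contacts[p])                                 # MMS
    infected |= newly
    print(f"step {step}: {len(infected)} infected")
```

A hybrid virus is just both update rules at once, which is why it spreads fastest in this sketch too.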

Source:
http://www.sciencedaily.com/releases/2009/05/090521161531.htm

Nationwide Academic Experiment


In an attempt to clean up America's education system, the US government is going to choose a small group of states to participate in an intensive training program backed by a 5 billion dollar budget. I agree with the nation's top education official when he proposed a nationwide educational program. America is one country, not 50 individual ones, so why do we have 50 separate systems for education, all of which differ slightly? However, California's chances are not looking too good, even though many superintendents are pushing for California's enrollment in the program. With budget cuts leading to fewer programs and instructional hours, it doesn't look like California will be helping its lower-performing students succeed.

I find it very ironic that states which need financial help to boost performance are required to show proof of their success, proof that is not evident due to the lack of funding in the first place. This is the exact scenario California is in right now: it desperately needs the extra funding to get back on its feet, but most likely won't be able to receive it. It's unthinkable that Governor Schwarzenegger is proposing to reduce the school year by a week. A whole week doesn't seem like a lot, but for a student just entering high school, by the time s/he graduates that adds up to one full month of lost instruction. In my opinion, the government should split the budget according to each state's population and distribute the funds to all of them.

Source: http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2009/05/22/BAR617PKQK.DTL

Google for Music


Electrical engineers at UC San Diego are working on a music identification system. What adds a new twist to the formula is their effort to take a general, genre-based approach to music selection. For example, when a user searches for "easy listening", the system will attempt to identify every song in its database that it determines to be "easy listening" and return the results as a recommended list. The key point, of course, is that the software (and to some extent, hardware) determines the genre by analyzing the digital information that represents the music.

Apparently, the system had trouble with Queen's Bohemian Rhapsody. I can't blame it.

The system needs initial parameters in order to determine what types of music fit into what genres. The researchers originally paid students to listen to songs and label them manually, but switched to a new model in which members of the social-networking site Facebook play a series of games that accomplish the same objective. Named Herd-It, the game involves users listening to music, identifying instruments, and finally labeling the songs to earn points on a high-score table. The closer a player's submission matches the normative answers among all players of the game, the higher the score.
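Here is a rough sketch of how that scoring idea might look in code. The majority-vote rule and the 10-points-per-match constant are my own guesses, not details from the article:

```python
# Sketch of the scoring idea as described: the closer a player's labels match
# the normative (here: majority) answer among all players, the more points.
from collections import Counter

def normative_answer(all_submissions: list[set[str]]) -> set[str]:
    counts = Counter(tag for s in all_submissions for tag in s)
    majority = len(all_submissions) / 2
    return {tag for tag, n in counts.items() if n >= majority}

def score(player: set[str], norm: set[str]) -> int:
    return 10 * len(player & norm)   # hypothetical: 10 points per agreement

submissions = [{"guitar", "piano"}, {"guitar", "drums"}, {"guitar", "piano"}]
norm = normative_answer(submissions)
print(norm, score({"guitar", "piano"}, norm))
```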

The researchers are also saving money. Hiding a research effort in the guise of a game allows them to utilize a human computer farm at the cost of a few hours of programming.

From A Queen Song To A Better Music Search Engine [Science Daily]

Sunday, May 17, 2009

Fire Ants


Fire ants are a pesky subspecies of ant which sting to kill instead of just biting like normal ants. They inject a venom whose effect is similar to a wasp sting. Within the past few years, the fire ant population in the United States has increased exponentially. Many researchers say the ants are coming in from Mexico, bringing an even stronger venomous sting.

Fire ants, unlike normal ants, are widely regarded as a non-beneficial species, so control measures have long been pursued to manage the population. A promising discovery involves a special fly, the phorid fly. These flies are able to kill fire ants by injecting eggs into the fire ant's brain. The larva eats away at the fire ant's brain and severs the head by releasing an acid that eats away at the neck. The larva then shelters in the hollowed head, emerging after about 40 days. Although this sounds great, the process is extremely slow relative to the number of fire ants already in existence. One of the more interesting details is that the larva, once in the brain, is actually able to control the fire ant, making this one of the few real examples of the zombie concept.

A group of professors released two separate batches of these phorid flies in the United States last year in hopes of helping control the fire ant population. If the flies fail, scientists are currently working on alternative methods of killing off the fire ants, such as fungi and viruses.

Source: http://news.nationalgeographic.com/news/2009/05/090515-zombie-ants-flies.html

Saturday, May 16, 2009

Life's Rocky Road: The History of Life on Earth


The lecture Life's Rocky Road: The History of Life on Earth was given this past Thursday by Nigel Hughes. The lecture's theme was that even though we cannot travel into the past to observe things as they actually occurred, we can use things occurring in the present to understand the past. Hughes illustrated this by pointing out that the number of rings in a tree trunk equals the age of the tree, and that the age of a rock can be found by examining the type of fossils within it.

Hughes then went on to list the earliest forms of life we currently know of. The earliest fossils that have been uncovered are 3400 million years old, though they are microscopic and cannot be seen with the naked eye. Since then, fossils of other types of bacteria and organisms have been found that help explain the process of growth in plants and animals. One organism's fossil, estimated to be 570 million years old, is thought to be a very early example of sexual reproduction. It is still unknown whether the organism was a plant or an animal, but it was large enough to be seen with the naked eye. Its discoverers believe it reproduced sexually because the fossil shows groups of large spores next to a cluster of smaller spores: the larger spores seem to represent the older generation, while the smaller ones appear to represent the younger generation.

Hughes lightly touched on the fact that organisms on Earth have extremely similar genes, even those with drastically different body types, which is evidence for our common ancestry with other organisms. The lecture concluded with the topic of anthropogenic global warming. Temperatures on Earth are rising at an alarming rate and promise to continue to do so. Although all we have to do to continue our run on Earth is adapt, the reality is that most organisms do not adapt in time. Only time will tell whether humans will survive.

Monday, May 11, 2009

Do Biofuels Really Help Save the Environment?


The production of ethanol will be increased as a result of a federal requirement. The Energy Independence and Security Act of 2007 (EISA) requires that the use of ethanol be expanded by the year 2015: "The Energy Independence and Security Act requires the United States to produce 15 billion gallons of corn-derived ethanol annually by 2015 and 16 billion gallons of fuel from cellulosic crops, such as switchgrass, by 2016" (ScienceDaily). However, several professors have found that this increase will have negative consequences, particularly for water and water quality.

First of all, a large quantity of water is necessary to produce ethanol. The researchers involved in the recent study found that corn grown in Nebraska "…would require 50 gallons of water per mile driven, when all the water needed in irrigation of crops and processing into ethanol is considered" (ScienceDaily). Fuel produced from sorghum grown in Nebraska would require up to 115 gallons of water per mile. The researchers also pointed out that increased demand for ethanol will most likely create more water pollution, both from the larger amount of pesticides needed to grow enough crops and from soil erosion. Dr. Joel G. Burken, one of the researchers involved, knows it is unlikely that the EISA will be revoked. Instead, the researchers hope these findings will lead lawmakers to consider the other consequences mandates may have. It is important that we help reduce the emission of greenhouse gases, but we must be conscious of any "environmental trade-offs" that may result.
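Some quick arithmetic on the quoted figures puts them in perspective. The 12,000 miles per year is my own illustrative assumption, not a number from the study:

```python
# Quick arithmetic on the figures quoted in the study: water consumed per
# year if a car's fuel came entirely from Nebraska corn or sorghum ethanol.
# The 12,000 miles/year figure is an assumption for illustration.
miles_per_year = 12_000

for crop, gallons_per_mile in [("corn", 50), ("sorghum", 115)]:
    water = miles_per_year * gallons_per_mile
    print(f"{crop}: {water:,} gallons of water per year")
# corn: 600,000 gallons; sorghum: 1,380,000 gallons
```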

Sunday, May 10, 2009

UCB Hacked


One of the servers at the University of California, Berkeley (UCB) has been compromised by hackers. The breach was not discovered until April 9th, though it had been going on since October 6th, 2008. It's amazing that the hackers were able to maintain control of the servers of such a big educational institution for such a long time. Forensic experts traced the attack and discovered that the hackers originated from China.

The hackers were discovered when a system maintenance worker found a message they left behind. Apparently, it is common for hackers to leave hidden messages telling the victim they are being hacked, similar to provoking or playing with prey. I don't really understand why they would give away the fact that the victim is being hacked, since the hackers could probably have maintained control of the server had no traces been left behind.

It appears that similar incidents have happened before, where information was stolen from UCB, but those cases were usually resolved before anything too major happened. This time, it seems the thieves got away with 97,000 social security numbers belonging to the staff, faculty, and students there. It is also interesting to note that the email notifying students and staff of the security breach was not sent out until two days after the hacking was discovered. Something as major as identity theft should be reported to the victims immediately.


Source: http://tech.yahoo.com/news/ap/20090508/ap_on_hi_te/us_tec_uc_data_theft

USB 3.0

This entry details the new USB 3.0 standard, with an emphasis on its differences from the 2.0 standard.

The new USB standard, published at the end of last year and expected to first hit markets at the end of 2009 or early 2010, is a significant improvement over the previous and current version, 2.0. USB 3.0, dubbed "SuperSpeed" (as opposed to the previous "High-Speed"), has several important features that will help it be adopted into the market. Probably the most important is backwards compatibility: all current 2.0 devices are usable with the new system. This is accomplished with two subsystems, one for 3.0 and one for 2.0. The two are pin compatible, so they share the same USB port: any 2.0 device is usable with 3.0, and vice versa.

Probably the most notable difference in the new standard is the raw speed: 5 Gbit/s! Compared to the previous 480 Mbit/s, the increase is substantial. It is accomplished by adding four more wires to the cable, which will make 3.0 cables noticeably larger than the previous 2.0 version, enough that the two cable types can be told apart visually.
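A quick back-of-the-envelope comparison of the two raw signaling rates; real throughput will be lower once encoding and protocol overhead are counted, so treat these as best-case numbers:

```python
# Back-of-the-envelope comparison of the two raw signaling rates. Real-world
# throughput is lower (encoding and protocol overhead), so these are
# upper bounds.
GB = 8 * 10**9  # bits in one gigabyte (decimal)

for name, bits_per_s in [("USB 2.0", 480 * 10**6), ("USB 3.0", 5 * 10**9)]:
    seconds = 25 * GB / bits_per_s      # a 25 GB file, e.g. an HD movie
    print(f"{name}: {seconds / 60:.1f} minutes")
# USB 2.0: ~6.9 minutes; USB 3.0: ~0.7 minutes at raw line rate
```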

Another notable change is in the power delivery capabilities of the system. In the prior version, 2.0, devices were guaranteed a maximum of 500 mA, with more negotiable. The threshold has now been raised to 900 mA, with more negotiable. The increase recognizes the fact that more and more embedded systems are using USB to charge batteries.

Source: "USB 3.0 SuperSpeed" by José Luis Rupérez Fombellida, published in the May 2009 Elektor.

Cody

Sunday, May 3, 2009

Invisibility Cloaks


In science fiction stories, the power to become invisible is not rare. Perhaps the most well-known example is Harry Potter, in which wizards can disappear from view simply by covering themselves with an invisibility cloak. What was once considered possible only in the realm of science fiction is now closer to becoming a reality. Scientists at the University of California, Berkeley have developed what they call a "carpet cloak", which allows objects placed underneath it to become undetectable to the eye. The carpet, made from nanostructured silicon, remains visible itself; however, the bulge created by the object hidden underneath appears flat as a beam of light hits the surface of the carpet.

Xiang Zhang believes that this development can lead to "…manipulating light at will for the creation of powerful new microscopes and faster computers." Zhang and his team had previously found that complex metamaterials can be used to bend light backwards. The team had attempted to use these metamaterials to achieve invisibility, but it could not be done because the metal elements absorb a large amount of light. Eventually the team began working with dielectric materials and created the new "carpet cloak." Although invisibility currently occurs only for light between 1400 and 1800 nanometers in wavelength, Zhang remains hopeful about what can be achieved in the future: "…with more precise fabrication this all dielectric approach to cloaking should yield a material that operates for visible light - in other words, true invisibility to the naked eye." (ScienceDaily).

Proofreading CPS

Cyber-physical systems (CPS) encompass all modern instances of interaction between a physical object and some computational portion of that object. A specific example of a CPS is the interaction between an airplane and its collision detection system.

As with any piece of computer software, a design flaw is ideally caught and fixed while still in the design stage. If a flaw makes it into the manufacturing stage, the only way to find it is trial and error. In the case of an airplane collision detection system, such trials would be expensive, time-consuming, and ultimately impractical.

A research team at Carnegie Mellon has developed a piece of software that, provided with some initial parameters, will attempt to find a counterexample illustrating a flaw in a given design. For example, an airplane on a collision course with another airplane will receive evasion instructions from its collision detection system in order to avoid a universally fatal crash. The software, after receiving the parameters of the collision detection system, will attempt to find a scenario in which the two planes still crash into each other.

Employing typical "brute-force" methods on this problem would be a computational nightmare, if not outright impossible, due to the infinitely many real-world variables that can affect the outcome of the system. The method employed in the software attempts to bypass this difficulty by extracting "differential invariants": basic properties of the problem that never change regardless of the variables. Using the many differential invariants in the collision problem, the software attempts to piece together a counterexample, or to prove that the design is sound when no such counterexample can exist.

The latter is obviously the difficult part.
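To give a flavor of the counterexample hunt, here is a toy Python sketch. The real Carnegie Mellon tool reasons symbolically with differential invariants rather than sampling at random, and the dynamics, evasion rule, and safety margin below are all invented for illustration:

```python
# Toy sketch (nothing like the Carnegie Mellon prover, which reasons
# symbolically): sample scenarios and look for one where two planes
# following a hypothetical evasion rule still get too close.
import random

SAFE_DISTANCE = 5.0   # hypothetical minimum separation

def evade(own_x, intruder_x):
    """Hypothetical evasion rule: climb away when the intruder is near."""
    return own_x + (1.0 if abs(own_x - intruder_x) < 10 else 0.0)

def violates_safety(own_x, intruder_x, steps=20):
    for _ in range(steps):
        own_x = evade(own_x, intruder_x)
        intruder_x += 0.5                      # intruder drifts closer
        if abs(own_x - intruder_x) < SAFE_DISTANCE:
            return True
    return False

random.seed(0)
for trial in range(10_000):
    own, intruder = random.uniform(0, 100), random.uniform(0, 100)
    if violates_safety(own, intruder):
        print("counterexample found:", own, intruder)
        break
else:
    print("no counterexample found (which is NOT a proof of soundness)")
```

The final comment is the whole point: random testing can only ever find bugs, while the invariant-based approach can actually prove their absence.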

Method For Verifying Safety Of Computer-controlled Devices Developed [Science Daily]

Magazine Rundown

I thought that I’d give a rundown of some of the magazines that I read regularly. This isn’t all of them, but a summary of the ones that I have on hand. There isn’t any particular order.

Seed: This is a science magazine. I like this magazine for the somewhat new age articles that they run, with enough in-depth analysis to be interesting but not overwhelming.



Circuit Cellar: This is a computer and electrical engineering magazine. This is a rather advanced publication, without too much emphasis on beginning students. It focuses on specific architectures and applications (within an article). It’s a good magazine for learning some of the more traditional techniques.


Servo: This is an (in my opinion) amateur robotics magazine. The articles are clearly made for a beginner, without very much depth. It’s interesting in a “getting back to basics” sort of way, but not something that you can learn too much from.



American Scientist: This is my favorite science magazine. It covers all sorts of science and engineering, and has in particular two columns that I look forward to: Computing Science and Engineering. The magazine presents excellent analysis on current topics.



IEEE Spectrum: This is the IEEE's general-interest magazine. It lacks the significant depth of the more specific journals, but it generally has interesting articles in a tech-forum vein.



Elektor: This electronics magazine is well formatted, and perhaps that's why I like it. The articles are generally fairly in-depth (a little less so than Circuit Cellar), and it's a well-published magazine.



Scientific American: This magazine is a bastion of the science magazines that I read. It's a little too pop-sci to be truly reputable, but it has intermediate-level articles that, with some dedication, can be worthwhile to read.



Nuts and Volts: I like this electronics engineering magazine for the well written and informative articles. There isn’t anything overwhelming here, but the contributors generally provide a good analysis of modern electronics (the breadboard kind), and it does have a column that occasionally deals with the Propeller that I like.


Communications of the ACM: I really like this computer science magazine. It carries what are basically journal articles, cleaned up to reduce the density and republished with more pictures. This is a good magazine in which to learn a specific topic in computing.



Make Magazine: This magazine (really almost a soft cover book) is a hobbyist technology magazine. It’s almost what Popular Mechanics was back in the day, before it was corrupted by bad articles. Make has many unique and interesting projects that, while I’ll never build most of them, are inspiring to do something of my own. Add in the culture that they nourish, and it’s an enjoyable read. Think Portland.

Cheers,

Cody

US Pollution

A recent survey has ranked Bakersfield, California as the most polluted city in all of America. Ranked third just last year, Bakersfield has now surpassed both Pittsburgh and Los Angeles. Surprisingly, when interviewed, local citizens of Bakersfield say that the smog in the city actually contributes to their way of life. Located in the Central Valley and surrounded by farmland, the town's heavy smog has numerous contributing factors. Much of the machinery used to manage the farms, such as tractors and planes, emits large quantities of pollutants, and the mist from the fertilizers and pesticides that protect the crops also contributes to the pollution. The natural environment and geographic location of the town play a role as well: because it sits within a valley, it is surrounded on three of its four sides by large mountains. This essentially boxes in the town and forces all the bad air to the ground, and keeps it there. The mountains also mean there is little to no wind to carry the polluted air away.

Although there have been plans to reduce and prevent pollution, there are limitations the town does not have control over. Cleaning up the air and adding regulations also costs a large amount of money. Even as corporations like Apple and Google adopt green policies, cities around America will continue to pollute themselves.

Monday, April 27, 2009

The Hephaestian




This is the sample cover for our team journal. When we begin publishing, we will have (as expected) custom covers based on the issue content.

Sunday, April 26, 2009

Online Privacy


Due to a creative interpretation of German copyright law, various record labels are now able to retrieve personal information from internet service providers and take legal action. One of the most prominent demonstrations of this occurred just last week: the popular website Rapidshare was forced to hand over logs of its uploaders' IP addresses. Ranked the 14th top site by Alexa, Rapidshare is a dedicated one-click file hosting service. Already, a man has been apprehended for uploading Metallica's new album Death Magnetic to Rapidshare a day before its scheduled release date.

However, this does not apply only to music labels. The movie industry and various other rights holders have also started taking advantage of this new interpretation of the law in Germany. It can extend beyond services like Rapidshare to the BitTorrent community in Germany as well. Especially after the verdict in The Pirate Bay trial, BitTorrent has been a hot topic lately. Numerous BitTorrent communities have already closed, most of them voluntarily, out of fear of being prosecuted.

Ironically, Lars Ulrich of Metallica has admitted to downloading one of his own albums over BitTorrent. A former anti-piracy advocate, Ulrich now concedes that the times of the internet are changing, though having tried piracy for the first time, he found the experience strangely bizarre.

In our current age of internet technology, privacy online is just as important as privacy in real life. Not having full control of where our personal information goes is a serious issue that is bound to grow as the internet progresses. Hopefully, the German law will be revised to protect online users.

Sources:

http://www.alexa.com/topsites
http://torrentfreak.com/rapidshare-shares-uploader-info-with-rights-holders-090425/
http://torrentfreak.com/metallica-frontman-pirates-his-own-album-090305/

Sandbots

This is a synopsis of the article "March of the Sandbots" by Daniel Goldman, Haldun Komsuoglu, and Daniel Koditschek, appearing in the April 2009 IEEE Spectrum.



Sand is a difficult place for a robot to maneuver, and it requires a different kind of design from the traditional. Consider the three most common forms of transportation today: wheels, tracks, and legs. Each of these fails in a sand environment. Tracks and wheels dig into the sand and freewheel; legs delve deep into the sand, and movement becomes prohibitively expensive. The authors present an alternate method of movement across sand: a crescent-shaped 'leg' that spins. With six per vehicle, the legs propel the robot by pushing it along in an alternating tripod walking pattern. The authors tested their vehicle in a unique setup that simulates sand without the mess: they filled a tank with poppy seeds and placed air nozzles on the bottom so they could agitate the 'sand' to just the density required.
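A sketch of how simple the alternating-tripod drive can be in software; the leg numbering and gait period here are my own invention, not taken from the article:

```python
# Sketch of the alternating-tripod drive described above: six rotating legs
# split into two sets of three, driven half a revolution out of phase so one
# tripod always pushes while the other swings over the top.
import math

TRIPOD_A = [0, 3, 4]   # e.g. front-left, middle-right, rear-left (hypothetical numbering)
TRIPOD_B = [1, 2, 5]

def leg_angles(t: float, period: float = 1.0) -> list[float]:
    """Angle of each crescent leg (radians) at time t."""
    base = 2 * math.pi * t / period
    return [base if leg in TRIPOD_A else base + math.pi for leg in range(6)]

for t in (0.0, 0.25, 0.5):
    print([f"{a % (2 * math.pi):.2f}" for a in leg_angles(t)])
```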



The interesting thing about this project is that the authors are designing their robot on the bionic principle: they identified the problem of robots failing in sand and looked to nature for an example. The crab was a particular inspiration, as was the zebra-tailed lizard.

One thing that I would have liked to see addressed in the article is how a robot with either a sled design or a mono-wheel design would fare in such an environment. Humans have used sleds to travel overland for thousands of years, so that might be a good starting point for another robotic design. Perhaps a pair of sleds where one pushes the other forward, and then they switch roles?

Error and Quantum Sensors

Continuing along the quantum computing tour path, researchers have harnessed a glaring shortfall of quantum computing methodology and morphed it into something practical. The premise is so simple that you have to wonder why the research didn't take place much sooner.

As with conventional computers, quantum computers are vulnerable to random noise as they process data. The popular design is to alter the fundamental unit of quantum mechanics, the atom, to represent data in a meaningful and ultimately useful way. Unfortunately, random noise for a quantum computer can be anything from the heat of the sun to the movement of electrons in the surrounding air.

An immediately practical application for such sensitivity is as the core of an extremely tiny, atom-sized sensor. Such "quantum sensors" would be able to detect natural occurrences several orders of magnitude below what is currently considered detectable. The article mentions, as an example, tiny magnetic waves emanating from the ocean floor that may indicate untapped oil reserves.

The Oxford researchers named their system the "quantum cat", after Schrödinger's thought experiment involving a box, a cat and a vial full of lethal poison. Perhaps the most interesting (and ironic) part of the story is the paradigm shift required in the manufacture of quantum computers.

"Many researchers try to make quantum states that are robust against their environment," said team member Dr Simon Benjamin of Oxford University’s Department of Materials, "but we went the other way and deliberately created the most fragile states possible."

Quantum Cat’s 'Whiskers' Offer Advanced Sensors [Science Daily]

Puijila darwini


Discoveries made in Canada's Arctic explain how seals went from land-based to marine-based animals. The fossil of a previously unknown web-footed carnivore was recently uncovered by Natalia Rybczynski. The animal is very similar to an otter, but has a skull that more closely resembles a seal's. It has been dubbed Puijila darwini: "Puijila" refers to a young seal, and the animal bears Darwin's name because it exemplifies what he wrote in the Origin of Species: "A strictly terrestrial animal, by occasionally hunting for food in shallow water, then in streams or lakes, might at last be converted into an animal so thoroughly aquatic as to brave the open ocean."

Puijila is part of the pinniped group, which also includes seals, sea lions, and walruses. It lived near freshwater lakes over 20 million years ago, and it is an example of a transitional animal: it did not have flippers like modern pinnipeds, but webbed feet. It was not well suited to living in water; the fossil suggests that Puijila swam using all four legs, unlike modern seals and otters, which depend on only their hind limbs to swim. There is evidence that Puijila darwini lived at the same time as Enaliarctos, the earliest pinniped known prior to this discovery. This means that even though a new body plan "refined for superior swimming" had emerged, Puijila darwini had not yet died out. The discovery has been extremely important to understanding how pinnipeds evolved into what they are today.

Sunday, April 19, 2009

Cybernetic Security

A recent bill brought before the Senate would give a single entity nearly total authority over private Internet networks. Introduced on April 1st, the Cybersecurity Act of 2009 would essentially create a single overseeing power that reports directly to the president. The bill would allow the government to search and request data from secure, private networks without regard to any existing policy. The fifty-one-page bill outlines when a private network is considered threatening to national security, and includes regulations to be imposed on private networks and systems, covering specified software, licensing, and testing of servers.

Most people, however, agree that digital information security is one of the most important issues we face today. Greg Nojeim of the Center for Democracy and Technology stated that the bill is extremely vague and would greatly broaden powers in favor of the government, while other critics argued that the American public must have their private information protected. In our day and age, our digital selves are just as important as our physical ones. It was pointed out that certain networks, such as those holding people's electric, banking, health, and traffic records, are prone to attack. Such attacks, if carried out, would have a large effect on the American public as a whole: not only could private information be destroyed or leaked, but trust in the American government would be greatly damaged. Despite his concerns, Nojeim admits the bill has important advantages, but strongly urges Congress to modify it.

USC Researchers Develop 3D Display


3D displays were long held in the realm of science fiction, as in the iconic Star Wars scene of R2-D2 playing back a prerecorded 3D video feed. More than three decades after the first Star Wars movie, researchers at the USC graphics lab have demonstrated a 3D display system using a rotating mirror.


As the mirror spins at a predetermined speed (measured in Hz, effectively rotations per unit of time), a projector situated above the mirror shoots images onto it in rapid succession, matching the mirror's rotation rate. By exploiting persistence of vision, the rotating mirror effectively tricks viewers into thinking they are seeing an image floating in the center of the display. What's more, a viewer can walk around the display and see the image from each perspective, creating a pseudo-three-dimensional representation of the object. The objects can either be cooked up inside the graphics lab or recorded on the fly for a live-feed version.


The idea is just as simple inside the black box: the algorithms developed for the display are tasked with simply showing the right image at the right time. The complexity lies solely in calibration, which includes the timing of the system and the synchronization of the mirror to the projector. Since the illusion of depth depends on persistence of vision, even a slight glitch in timing may well prove to be a jarring experience.
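Some illustrative timing arithmetic shows why the calibration is so demanding. The numbers below are made up for the sketch, not the lab's actual specifications:

```python
# Rough timing arithmetic for the spinning-mirror idea (numbers are
# illustrative, not the USC lab's actual specifications).
mirror_hz = 20            # mirror revolutions per second
views_per_rev = 288       # distinct viewing angles around the display

frames_per_second = mirror_hz * views_per_rev
slot_ms = 1000 / frames_per_second
print(f"projector must deliver {frames_per_second} frames/s")
print(f"each view owns a {slot_ms:.3f} ms slot; drift beyond that breaks the illusion")
```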


Friday, April 17, 2009

Schizophrenia and Disconnectivity in the Brain


Most people have been fooled by an optical illusion at some point. The Hermann grid has people seeing gray shadows in what are actually pure white intersecting lines; the Ebbinghaus illusion tricks people into thinking one circle is larger than another when they are actually the same size. In the past, studies have found that people with schizophrenia are immune to some optical illusions. The latest studies, made in Germany and the UK, have found that schizophrenics are immune to the hollow mask illusion, which causes people to view a concave face as convex (it is seen as a normal face when it is actually sunk in).

Scientists believe that schizophrenics are not fooled by the illusion "…because their brain disconnects 'what the eyes see' from what 'the brain thinks it is seeing'" (ScienceDaily). The two parts of the brain involved, the "bottom-up" process where the eyes collect visual information and the "top-down" process where that information is interpreted, have difficulty communicating with each other. This is known as dysconnectivity.

The findings of the study also help explain why people who use cannabis are often immune to optical illusions while under the influence of the drug. THC, the main psychoactive ingredient in cannabis, is a psychotomimetic compound that creates psychotic-like effects in its users, and it can make the two parts of the brain that collect and interpret information have difficulty communicating. For this reason, the person experiences temporary dysconnectivity.
ScienceDaily: