7.1.7 Compare a centrally controlled system with a distributed system.

=SETI@home =

SETI@home uses distributed computing to perform data analysis on microwave signals received from space in the search for extraterrestrial intelligence. By installing the BOINC client, a volunteer's home computer joins a network of computers, which are in turn connected to a central server where the results are collated. Over 3 million computers have been connected to this network.
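The pattern behind this can be sketched in a few lines: a server splits the data into independent work units, each volunteer computer analyses its own unit, and the server collates the partial results. (The "analysis" below is a placeholder peak search, not real SETI or BOINC code.)

```python
# Toy sketch of the volunteer-computing work-unit model (illustrative only).

def split_into_work_units(signal, unit_size):
    """Server side: chunk the signal into independent work units."""
    return [signal[i:i + unit_size] for i in range(0, len(signal), unit_size)]

def analyse_unit(unit):
    """Client side: each volunteer reports the strongest sample it saw."""
    return max(unit)

def collate(partial_results):
    """Server side: combine the volunteers' answers."""
    return max(partial_results)

signal = [0.1, 0.3, 0.2, 0.9, 0.4, 0.5]      # pretend microwave samples
units = split_into_work_units(signal, 2)      # 3 independent work units
partials = [analyse_unit(u) for u in units]   # done on volunteers' PCs
print(collate(partials))                      # 0.9 — strongest signal found
```

The key property is that the work units are independent, so they can be processed in any order, on any machine, at any time the volunteer's computer happens to be idle.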


=Climateprediction.net =

Climateprediction.net is a distributed computing project that works towards reducing the uncertainties in climate models. It does this by running hundreds of thousands of model variants using the idle time of personal computers, giving a better understanding of how climate models are affected by changes in many parameters. The whole project relies on volunteer computing: client-side processes run on people's computers and the outputs are then examined server-side. Climateprediction.net is run mainly at Oxford University in England and has generated more data than any other climate prediction project, producing over 100 million model years of data.
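The need for so many runs comes from the size of the parameter space: even a toy model with three uncertain parameters, each tried at three values, yields a grid of independent jobs that can each be farmed out to a different volunteer's idle PC. (The parameter names and values below are illustrative, not the project's real model inputs.)

```python
# Illustrative parameter sweep: every combination is an independent job.
from itertools import product

cloud_feedback = [0.5, 1.0, 1.5]    # hypothetical parameter values
ocean_mixing = [0.8, 1.0, 1.2]
co2_sensitivity = [2.0, 3.0, 4.5]

jobs = list(product(cloud_feedback, ocean_mixing, co2_sensitivity))
print(len(jobs))  # 27 runs from just three 3-value parameters
```

With dozens of parameters the grid explodes combinatorially, which is why distributing the runs across volunteers is essential.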

=Bitcoin =

Bitcoin is a digital currency, first introduced in 2009, which is described as a peer-to-peer electronic cash system.

Bitcoin creation and transfer are based on an open-source cryptographic protocol and are not managed by any central authority. Each bitcoin is divisible to eight decimal places, into 100 million smaller units called satoshis. Bitcoins can be transferred from a computer or smartphone without an intermediary financial institution.
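Because of this subdivision, bitcoin amounts are usually handled as integer satoshi counts rather than floating-point BTC, which avoids rounding errors. A minimal conversion sketch:

```python
# Each bitcoin subdivides into 10**8 satoshis; storing amounts as
# integers sidesteps floating-point rounding problems.
from decimal import Decimal

SATOSHIS_PER_BTC = 100_000_000

def btc_to_satoshis(btc_str):
    """Parse a decimal BTC string (up to 8 places) into integer satoshis."""
    return int(Decimal(btc_str) * SATOSHIS_PER_BTC)

print(btc_to_satoshis("1"))           # 100000000
print(btc_to_satoshis("0.00000001"))  # 1 — one satoshi, the smallest unit
```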

The processing of bitcoin transactions is secured by servers called Bitcoin miners. These servers communicate over an internet-based network and confirm transactions by adding them to a ledger which is updated and archived periodically. In addition to archiving transactions, each new ledger update creates some newly mined bitcoins. The number of new bitcoins created in each update is halved roughly every four years, until around the year 2140 when this number rounds down to zero.



=Rosetta@home =

Rosetta@home is a distributed computing project for protein structure prediction on the Berkeley Open Infrastructure for Network Computing (BOINC) platform, run by the Baker laboratory at the University of Washington. Rosetta@home aims to predict protein–protein docking and design new proteins, with the help of about sixty thousand active volunteer computers processing at 62 teraFLOPS on average as of October 18, 2011. It is also used to research diseases such as Alzheimer's, anthrax, HIV and malaria.

Since the project runs through BOINC, it is available on Windows, Linux and Mac. To participate, a computer needs at least a 500 MHz processor, 200 megabytes of free disk space, 512 megabytes of physical memory, and Internet connectivity.

=Big and Ugly Rendering Project=

BURP provides 2D and 3D rendering for animation artists. The idea is to use spare CPU cycles on participating computers around the world to render 3D images and animations submitted by the users of the BURP network.

BURP is built on the Open Rendering Environment (ORE), and every user in the system has access to the content being rendered.

The Berkeley Open Infrastructure for Network Computing (BOINC) is an open source middleware system for volunteer and grid computing. It is software that uses the unused CPU and GPU cycles on a computer to do scientific computing.

=Pros and Cons of distributed computing=
|| **Pros** || **Cons** ||
|| Cost - cheaper than running a supercomputer || Network bandwidth - bottlenecks can occur if a lot of data is transferred at once ||
|| Performance - with enough computers connected, performance can far exceed a supercomputer's || Security - a project handling potentially sensitive information risks a security breach ||
|| Reliability - if one or two computers go offline, overall processing is barely affected because all the other computers are still online and running the analysis || Software complexity - software has to be written for the project, and it may put volunteers off if it comes across as too complicated ||
|| Scalability - it is very easy to increase the amount of processing power due to the nature of the software ||  ||