Monday, July 23, 2007

Back to the Future in Operating Systems

The PC was introduced as a way to escape the time and cost constraints of mainframe usage. Now large mainframes are being used again, this time as servers. The operating systems that link networks, such as the internet, add a layer of complexity that goes beyond simply linking other nodes. Distributed operating systems use the entire network in a holistic manner. Research distributed computing, and comment -- with a URL -- on what you find.

Blog on!

13 Comments:

Blogger Manny said...

What is Distributed Computing and DCE?
The OSF Distributed Computing Environment (DCE) is an industry-standard, vendor-neutral set of distributed computing technologies. DCE is deployed in critical business environments by a large number of enterprises worldwide. It is a mature product with three major releases, and is the only middleware system with a comprehensive security model.

DCE provides a complete Distributed Computing Environment infrastructure. It provides security services to protect and control access to data, name services that make it easy to find distributed resources, and a highly scalable model for organizing widely scattered users, services, and data. DCE runs on all major computing platforms and is designed to support distributed applications in heterogeneous hardware and software environments. DCE is a key technology in three of today's most important areas of computing: security, the World Wide Web, and distributed objects.

http://www.opengroup.org/dce/

2:16 PM  
Blogger Manny said...

Here is another site that has the latest version of BOINC...

orbit@home will use your computer during down time to monitor the impact hazard posed by Near-Earth objects...

http://orbit.psi.edu/

2:27 PM  
Anonymous Anonymous said...

Distributed computing is the type of computing in which different components and objects comprising an application can be located on different computers connected to a network. So, for example, a word processing application might consist of an editor component on one computer, a spell-checker object on a second computer, and a thesaurus on a third computer. In some distributed computing systems, each of the three computers could even be running a different operating system.

One of the requirements of distributed computing is a set of standards that specify how objects communicate with one another. There are currently two chief distributed computing standards: CORBA and DCOM.
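
Neither CORBA nor DCOM is needed to see the idea in miniature. Here is a rough sketch in Python of an editor component calling a remote spell-checker component, with the built-in XML-RPC library standing in for the object broker; the host name "spellhost", the port, and the tiny word list are made up for illustration.

# Sketch only: an "editor" on one machine asks a "spell-checker" on another.
# Not CORBA or DCOM -- Python's built-in XML-RPC stands in for the object broker.
# The host name "spellhost", port 8000, and the word list are invented.
import sys
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

WORDS = {"distributed", "computing", "operating", "system"}

def check(word):
    # The spell-checker "object": True if the word is in our tiny dictionary.
    return word.lower() in WORDS

if sys.argv[1:] == ["server"]:      # run this on the spell-checker machine
    srv = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
    srv.register_function(check, "check")
    srv.serve_forever()
else:                               # run this on the editor machine
    remote = xmlrpc.client.ServerProxy("http://spellhost:8000")
    for w in ["distributed", "computre"]:
        print(w, "ok" if remote.check(w) else "misspelled?")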

Jennifer
source:
http://www.webopedia.com/TERM/D/distributed_computing.html

5:47 PM  
Anonymous Anonymous said...

A distributed OS manages a collection of independent computers and makes them appear to the user as a single computer. The multicomputer network is loosely coupled: each node has private memory and is autonomous. To the user it looks like a virtual uniprocessor, with n copies of the operating system communicating via messages and queues.
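
A rough sketch of that message-and-queue picture in Python, with multiprocessing standing in for the loosely coupled nodes (each "node" keeps only private state and communicates purely by messages; the node name is made up):

# Loosely coupled "nodes" with private memory, talking only via message queues.
# multiprocessing stands in for separate machines; the node name is invented.
from multiprocessing import Process, Queue

def node(name, inbox, outbox):
    total = 0                      # private memory -- nothing is shared
    while True:
        msg = inbox.get()          # communication happens only through messages
        if msg == "stop":
            outbox.put((name, total))
            return
        total += msg

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    p = Process(target=node, args=("node-1", inbox, outbox))
    p.start()
    for value in (1, 2, 3):
        inbox.put(value)
    inbox.put("stop")
    print(outbox.get())            # ('node-1', 6)
    p.join()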

http://www-vs.informatik.uni-ulm.de:81/DOSinWWW/DistribSys.html

Junji

6:43 PM  
Blogger Manny said...

Nortel wins Verizon packet switch contract

http://telephonyonline.com/access/web/telecom_nortel_wins_verizon/

Here is a packet switch going in at Verizon. IP is becoming telephony... (circuit switching is going away)

6:52 PM  
Anonymous Anonymous said...

Sun has unveiled distributed computing software, trying its hand at peer-to-peer and distributed computing. Sun recently announced Project JXTA, which allows for easy access and was created to smooth the information-gathering process across multiple platforms. Users will be able to communicate with each other effectively across anything from PDAs to PCs to servers -- which I personally find pretty cool. Project JXTA's formation is documented on jxta.org.

http://www.internetnews.com/dev-news/article.php/10_752501

dorothy

6:03 PM  
Anonymous Anonymous said...

Distributed computing caught my eye about a year ago when I found out about some amazing projects being undertaken in this manner. The one that caught my attention was the project to map out the human genome. Since then, I have looked into other projects such as SETI and astrophysics. The thought that ANY networked computer can make a difference (like a cell in an organism) by lending its potential to a greater scheme is grandiose.
A grand project is divided into manageable sections, and the host server distributes those packets to each participant. The individual computers churn out their bit of information and deliver their results to the host server. In a semi-automatic procedure, packets are continuously delivered and received. The computing power is limited only by the number of participating computers.
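
A toy version of that divide-and-distribute loop in Python -- the "host" splits a job into work units and collects the results; threads on one machine stand in for the volunteer PCs, and the work itself is just summing squares:

# Toy work-unit distribution: the host splits a big job into packets,
# "participants" (threads here, volunteer PCs in real projects) return results.
from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk):
    # Stand-in for the real science done on each participant's computer.
    return sum(x * x for x in chunk)

total_n = 1_000_000
chunks = [range(i, min(i + 100_000, total_n)) for i in range(0, total_n, 100_000)]

with ThreadPoolExecutor(max_workers=4) as pool:    # four "participants"
    results = list(pool.map(work_unit, chunks))

print("combined result:", sum(results))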
http://en.wikipedia.org/wiki/List_of_distributed_computing_projects
-Carlos

5:19 PM  
Anonymous Anonymous said...

Most servers now are set to do only one job, so that they do not get bogged down. For instance, proxy servers let network traffic go out, and file servers hold data for the users. But they can do more than just one job, and the servers can also be made bigger as time goes on just by adding more parts or blades.

John

6:04 PM  
Blogger Chad said...

Disadvantages:

If poorly planned, a distributed system can decrease the overall reliability if the unavailability of a node can cause disruption of other nodes. "A distributed system is one in which the failure of a computer you didn't even know existed can render your own computer unusable." -Leslie Lamport

Troubleshooting and diagnosing problems in a distributed system can also become more difficult, because the analysis may require connecting to remote nodes or inspecting communication between nodes.

Many types of computation are not well suited for distributed environments. If bandwidth, latency, or communication requirements are too significant, then the benefits of distributed computing may be negated and the performance may be worse than a non-distributed environment.
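
That first failure mode is why real distributed code never lets a remote call block forever; a minimal sketch with a timeout and retry (the URL is hypothetical, and urllib is only a stand-in for any remote request):

# Sketch: never let an unknown remote computer hang yours -- bound every call.
# The URL is hypothetical; urllib stands in for any remote request.
import urllib.request

def fetch_with_retry(url, attempts=3, timeout=2.0):
    for i in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError as err:       # covers connection errors and timeouts
            print(f"attempt {i + 1} failed: {err}")
    return None                      # degrade gracefully instead of freezing

data = fetch_with_retry("http://some-node.example/status")
print("node unreachable" if data is None else f"got {len(data)} bytes")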

http://en.wikipedia.org/wiki/Distributed_computing

12:45 PM  
Anonymous Anonymous said...

Distributed computing is a type of segmented or parallel computing, but the latter term is most commonly used to refer to processing in which different parts of a program run simultaneously on two or more processors that are part of the same computer. While both types of processing require that a program be segmented—divided into sections that can run simultaneously—distributed computing also requires that the division of the program take into account the different environments on which the different sections of the program will be running.
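
The "segmented" half of that is easy to show on a single computer; a small Python sketch where the program is divided into sections that run simultaneously on separate processors (a distributed version would additionally ship each section to a different machine and environment):

# The parallel/"segmented" half of the idea on one computer: independent
# sections of the program run simultaneously on separate processors.
from multiprocessing import Pool

def section(n):
    # One independently runnable section of the program.
    return sum(range(n))

if __name__ == "__main__":
    with Pool(processes=3) as pool:
        partials = pool.map(section, [10_000, 20_000, 30_000])
    print("sections finished:", partials)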

http://en.wikipedia.org/wiki/Distributed_computing

Daniel

3:10 PM  
Anonymous Anonymous said...

This distributed computing thing could be very useful for running huge programs -- huge programs requiring massive amounts of processing power. It's not practical on an individual basis, but it can be used for huge scientific studies and any number of large-scale projects. We could see what the 1,000,000,000,000,000th decimal of Pi is, just for fun.

http://en.wikipedia.org/wiki/Distributed_computing

11:14 PM  
