SEMINAR REPORT ON “WINDOWS AZURE” Submitted By MANIKANT ROY Under The Guidance Of Prof. Nishant Pachpor In Partial Fulfilment Of The Requirement Of MASTER OF COMPUTER APPLICATIONS (Semester – III) 2011-2012 DEPARTMENT OF MASTER IN COMPUTER APPLICATION (UNDER FACULTY OF ENGINEERING), SINHGAD INSTITUTE OF TECHNOLOGY (LONAVALA). CERTIFICATE Mr. MANIKANT ROY of class MCA IInd, Roll No: MS45, has given a seminar on “Windows Azure” with satisfactory work as prescribed by the “University of Pune” in the academic year 2011-2012 in the Department of Master of Computer Application. Place: Lonavala
Date: 10th September 2011 Seminar Guide HEAD OF DEPT. PRINCIPAL [Prof. Nishant Pachpor] ACKNOWLEDGEMENT I would like to take this opportunity to express my gratitude to all who have extended their full support and guidance for the timely completion of the Seminar presentation. Here I wish to express my sincere and deepest gratitude to my Principal for his constant help, inspiration and encouragement in the successful completion of the Seminar. I would like to thank my guide Prof. NISHANT PACHPOR for his sincere, selfless and unique guidance.
I express my sincere thanks to the Staff Members of the Master of Computer Application Department, who have directly and indirectly extended their kind cooperation in the completion of my Seminar. Last but not the least, I feel proud to express my deep gratitude to my Parents, without whose blessings the present work would never have been completed. Place: Lonavala Date: 10 September (Mr. Mani Kant Roy)

CONTENTS
1) Abstract
2) Introduction
   a. Overview of Azure Services Platform
   b. Windows Azure
   c. .NET Services
   d. SQL Services
   e. Live Services
3) Windows Azure
4) .NET Services
5) SQL Services
6) Live Services
7) Conclusion
8) Bibliography
9) Presentation Author

1) Abstract Cloud computing is the next natural step in the evolution of on-demand information technology services and products. To a large extent, cloud computing will be based on virtualized resources. The idea of cloud computing is based on a very fundamental principle: the reusability of IT capabilities.
The difference that cloud computing brings compared to traditional concepts of grid computing, distributed computing, utility computing, or autonomic computing is that it broadens horizons across organizational boundaries. According to the IEEE Computer Society, cloud computing is: “A paradigm in which information is permanently stored in servers on the Internet and cached temporarily on clients that include desktops, entertainment centers, tablet computers, notebooks, wall computers, handhelds, etc.”
Though many cloud computing architectures and deployments are powered by grids, based on autonomic characteristics, and consumed on the basis of utility billing, the concept of a cloud is fairly distinct from, and complementary to, the concepts of grid, SaaS, utility computing, etc. In theory, cloud computing promises availability of all required hardware, software, platforms, applications, infrastructure and storage, with nothing more than an Internet connection required. People can access the information that they need from any device with an Internet connection, including mobile and handheld phones, rather than being chained to the desktop.
It also means lower costs, since there is no need to install software or hardware. Cloud computing is already used for everyday tasks such as posting and sharing photos on Orkut and instant messaging with friends, as well as for maintaining and upgrading business technology. INTRODUCTION a) Overview of Azure Services Platform Using computers in the cloud can make lots of sense. Rather than buying and maintaining your own machines, why not exploit the acres of Internet-accessible servers on offer today? For some applications, both code and data might live in the cloud, where somebody else manages and maintains the systems they use.
Alternatively, applications that run on-premises, inside an organization, might store data in the cloud or rely on other cloud infrastructure services. Applications that run on desktops and mobile devices can use services in the cloud to synchronize information across many systems or in other ways. However it’s done, exploiting the cloud’s capabilities can improve our world. But whether an application runs in the cloud, uses services provided by the cloud, or both, some kind of application platform is required.
Viewed broadly, an application platform can be thought of as anything that provides developer-accessible services for creating applications. In the local, on-premises Windows world, for example, this includes technologies such as the .NET Framework, SQL Server, and more. To let applications exploit the cloud, cloud application platforms must also exist. And because there are a variety of ways for applications to use cloud services, different kinds of cloud platforms are useful in different situations.
Microsoft’s Azure Services Platform is a group of cloud technologies, each providing a specific set of services to application developers. As Figure 1 shows, the Azure Services Platform can be used both by applications running in the cloud and by applications running on local systems. Figure 1: The Azure Services Platform supports applications running in the cloud and on local systems The components of the Azure Services Platform can be used by local applications running on a variety of systems, including various flavors of Windows, mobile devices, and others. Those components include:
Windows Azure: Provides a Windows-based environment for running applications and storing data on servers in Microsoft data centers. Microsoft .NET Services: Offers distributed infrastructure services to cloud-based and local applications. Microsoft SQL Services: Provides data services in the cloud based on SQL Server. Live Services: Through the Live Framework, provides access to data from Microsoft’s Live applications and others. The Live Framework also allows synchronizing this data across desktops and devices, finding and downloading applications, and more. b) Windows Azure
At a high level, Windows Azure is simple to understand: It’s a platform for running Windows applications and storing their data in the cloud. Figure 2 shows its main components. Figure 2: Windows Azure provides Windows-based compute and storage services for cloud applications As the figure suggests, Windows Azure runs on a large number of machines, all located in Microsoft data centers and accessible via the Internet. A common Windows Azure fabric knits this plethora of processing power into a unified whole. Windows Azure compute and storage services are built on top of this fabric.
The Windows Azure compute service is based, of course, on Windows. For the initial availability of this service, a Community Technology Preview (CTP) made public in the fall of 2008, Microsoft allowed Windows Azure to run only applications built on the .NET Framework. The company has announced plans to support unmanaged code as well, i.e., applications that aren’t built on the .NET Framework, on Windows Azure in 2009. In the CTP version of Windows Azure, developers can create .NET-based software such as ASP.NET applications and Windows Communication Foundation (WCF) services. To do this, they can use C# and other .NET languages, along with traditional development tools such as Visual Studio 2008. And while many developers are likely to use this initial version of Windows Azure to create Web applications, the platform also supports background processes that run independently; it’s not solely a Web platform. Both Windows Azure applications and on-premises applications can access the Windows Azure storage service, and both do it in the same way: using a RESTful approach. The underlying data store is not Microsoft SQL Server, however. In fact, Windows Azure storage isn’t a relational system, and its query language isn’t SQL.
Because it’s primarily designed to support applications built on Windows Azure, it provides simpler, more scalable kinds of storage. Accordingly, it allows storing binary large objects (blobs), provides queues for communication between components of Windows Azure applications, and even offers a form of tables with a straightforward query language. Running applications and storing their data in the cloud can have clear benefits. Rather than buying, installing, and operating its own systems, for example, an organization can rely on a cloud provider to do this for them.
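The RESTful access to blob storage described above can be sketched as plain HTTP. The sketch below only constructs the pieces of such a request; the account name, container, and blob are invented, and the headers are a simplified subset (a real request also carries a signed Authorization header).

```python
# Sketch of the RESTful style Windows Azure storage uses. The account name,
# container, and blob below are illustrative, not real resources.

def build_blob_put_request(account, container, blob_name, data):
    """Construct the pieces of an HTTP PUT that would upload a blob."""
    url = f"http://{account}.blob.core.windows.net/{container}/{blob_name}"
    headers = {
        "Content-Length": str(len(data)),
        "x-ms-date": "Mon, 10 Nov 2008 12:00:00 GMT",  # request timestamp
        # A real request also includes an Authorization header signed with
        # the storage account key; omitted here for brevity.
    }
    return "PUT", url, headers, data

method, url, headers, body = build_blob_put_request(
    "myaccount", "photos", "cat.jpg", b"\x89PNG...")
print(method, url)  # PUT http://myaccount.blob.core.windows.net/photos/cat.jpg
```

Because the interface is just HTTP, any client that can issue web requests, on any platform, can read and write this storage.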
Also, customers pay just for the computing and storage they use, rather than maintaining a large set of servers only for peak loads. And if they’re written correctly, applications can scale easily, taking advantage of the enormous data centers that cloud providers offer. Yet achieving these benefits requires effective management. In Windows Azure, each application has a configuration file, as shown in Figure 2. By changing the information in this file manually or programmatically, an application’s owner can control various aspects of its behavior, such as setting the number of instances that Windows Azure should run.
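The per-application configuration file just mentioned might look something like the sketch below. The element and attribute names loosely follow the ServiceConfiguration format of the CTP; treat the exact names and values as illustrative.

```xml
<!-- Hypothetical configuration sketch: raising the Instances count tells
     the Windows Azure fabric to run more copies of the application. -->
<ServiceConfiguration serviceName="MyCloudApp">
  <Role name="WebRole">
    <Instances count="3" />
    <ConfigurationSettings>
      <Setting name="AccountName" value="myaccount" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

Editing this file, by hand or programmatically, is how an owner asks the platform for a different desired state.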
The Windows Azure fabric monitors the application to maintain this desired state. To let its customers create, configure, and monitor applications, Windows Azure provides a browser-accessible portal. A customer provides a Windows Live ID, then chooses whether to create a hosting account for running applications, a storage account for storing data, or both. An application is free to charge its customers in any way it likes: subscriptions, per-use fees, or anything else. Windows Azure is a general platform that can be used in various scenarios. Here are a few examples, all based on what the CTP version allows:
A start-up creating a new Web site (the next Facebook, say) could build its application on Windows Azure. Because this platform supports both Web-facing services and background processes, the application can provide an interactive user interface as well as executing work for users asynchronously. Rather than spending time and money worrying about infrastructure, the start-up can instead focus solely on creating code that provides value to its users and investors. The company can also start small, incurring low costs while its application has only a few users.
If their application catches on and usage increases, Windows Azure can scale the application as needed. An ISV creating a software-as-a-service (SaaS) version of an existing on-premises .NET application might choose to build it on Windows Azure. Because Windows Azure mostly provides a standard .NET environment, moving the application’s .NET business logic to this cloud platform won’t typically pose many problems. And once again, building on an existing platform lets the ISV focus on their business logic, the thing that makes them money, rather than spending time on infrastructure.
An enterprise creating an application for its customers might choose to build it on Windows Azure. Because Windows Azure is . NET-based, developers with the right skills aren’t difficult to find, nor are they prohibitively expensive. Running the application in Microsoft’s data centers frees the enterprise from the responsibility and expense of managing its own servers, turning capital expenses into operating expenses. And especially if the application has spikes in usage—maybe it’s an on-line flower store that must handle the Mother’s Day rush—letting Microsoft maintain the large server base required for this can make economic sense.
Running applications in the cloud is one of the most important aspects of cloud computing. With Windows Azure, Microsoft provides a platform for doing this, along with a way to store application data. .NET Services Running applications in the cloud is an important aspect of cloud computing, but it’s far from the whole story. It’s also possible to provide cloud-based services that can be used by either on-premises applications or cloud applications. Filling this gap is the goal of .NET Services. Originally known as BizTalk Services, the functions provided by .NET Services address common infrastructure challenges in creating distributed applications. Figure 3 shows its components. Figure 3: .NET Services provides cloud-based infrastructure that can be used by both cloud and on-premises applications. The components of .NET Services are: Access Control: An increasingly common approach to identity is to have each user supply an application with a token containing some set of claims. The application can then decide what this user is allowed to do based on these claims.
Doing this effectively across companies requires identity federation, which lets claims created in one identity scope be accepted in another. It might also require claims transformation, modifying claims when they’re passed between identity scopes. The Access Control service provides a cloud-based implementation of both. Service Bus: Exposing an application’s services on the Internet is harder than most people think. The goal of Service Bus is to make this simpler by letting an application expose Web services endpoints that can be accessed by other applications, whether on-premises or in the cloud.
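The claims transformation just described for Access Control can be sketched in a few lines. The claim names and mapping rules below are invented for illustration; the real service is configured with per-scope rules rather than a Python dictionary.

```python
# A minimal sketch of claims transformation between identity scopes.
# Claim types, values, and rules here are hypothetical.

def transform_claims(incoming, rules):
    """Map claims issued in one identity scope into the set an app expects."""
    outgoing = {}
    for claim_type, value in incoming.items():
        rule = rules.get((claim_type, value))
        if rule:  # rewrite the claim according to the configured rule
            new_type, new_value = rule
            outgoing[new_type] = new_value
        # claims with no matching rule are simply not passed through
    return outgoing

# Partner A calls its administrators "Domain Admins"; the application
# only understands a generic "role" claim.
rules = {("group", "Domain Admins"): ("role", "manager")}
claims_in = {"group": "Domain Admins", "email": "alice@partner-a.example"}
print(transform_claims(claims_in, rules))  # {'role': 'manager'}
```

Running this mapping in the cloud, rather than in each application, is exactly the offloading the next paragraph describes.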
Each exposed endpoint is assigned a URI, which clients can use to locate and access the service. Service Bus also handles the challenges of dealing with network address translation and getting through firewalls without opening new ports for exposed applications. Workflow: Creating composite applications, as in enterprise application integration, requires logic that coordinates the interaction among the various parts. This logic is sometimes best implemented using a workflow capable of supporting long-running processes.
Built on Windows Workflow Foundation (WF), the Workflow service allows running this kind of logic in the cloud. Here are some examples of how .NET Services might be used: An ISV that provides an application used by customers in many different organizations might use the Access Control service to simplify the application’s development and operation. For example, this .NET Services component could translate the diverse claims used in the various customer organizations, each of which might use a different identity technology internally, into a consistent set that the ISV’s application could use.
Doing this also allows offloading the mechanics of identity federation onto the cloud-based Access Control service, freeing the ISV from running its own on-premises federation software. Suppose an enterprise wished to let software at its trading partners access one of its applications. It could expose this application’s functions through SOAP or RESTful Web services, then register their endpoints with Service Bus. Its trading partners could then use Service Bus to find these endpoints and access the services. Since doing this doesn’t require opening new ports in the organization’s firewall, it reduces the risk of exposing the application.
The organization might also use the Access Control service, which is designed to work with Service Bus, to rationalize identity information sent to the application by these partners. Perhaps the organization in the previous example needs to make sure that a business process involving its trading partners is executed consistently. To do this, it can use the Workflow service to implement a WF-based application that carries out this process. The application can communicate with partners using Service Bus and rely on the Access Control service to smooth out differences in identity information. SQL Services
One of the most attractive ways of using Internet-accessible servers is to handle data. This means providing a core database, certainly, but it can also include more. The goal of SQL Services is to provide a set of cloud-based services for storing and working with many kinds of data, from unstructured to relational. Microsoft says that SQL Services will include a range of data-oriented facilities, such as reporting, data analytics, and others. The first SQL Services component to appear, however, is SQL Data Services. Figure 4 illustrates this idea. Figure 4: SQL Services provides data-oriented facilities in the cloud.
SQL Data Services, formerly known as SQL Server Data Services, provides a database in the cloud. As the figure suggests, this technology lets on-premises and cloud applications store and access data on Microsoft servers in Microsoft data centers. As with other cloud technologies, an organization pays only for what it uses, increasing and decreasing usage (and cost) as the organization’s needs change. Using a cloud database also allows converting what would be capital expenses, such as investments in disks and database management systems (DBMSs), into operating expenses.
A primary goal of SQL Data Services is to be broadly accessible. Toward this end, it exposes both SOAP and RESTful interfaces, allowing its data to be accessed in various ways. And because this data is exposed through standard protocols, SQL Data Services can be used by applications on any kind of system—it’s not a Windows-only technology. Unlike the Windows Azure storage service, SQL Data Services is built on Microsoft SQL Server. Nonetheless, the service does not expose a traditional relational interface. Instead, SQL Data Services provides a hierarchical data model that doesn’t require a pre-defined schema.
Each data item stored in this service is kept as a property with its own name, type, and value. To query this data, applications can use direct RESTful access or a language based on the C# syntax defined by Microsoft’s Language Integrated Query (LINQ). There’s an obvious question here: Why not just offer SQL Server in the cloud? Why instead provide a cloud database service that uses an approach different from what most of us already know? One answer is that providing this slightly different set of services offers some advantages.
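The schema-free, property-based model just described can be sketched with plain dictionaries standing in for entities and a small LINQ-style filter. The entity shapes and the `where` helper below are illustrative, not the service’s actual API.

```python
# Sketch of SQL Data Services' flexible model: each entity is a bag of named,
# typed properties, and entities need not share a schema.

entities = [
    {"Id": "b1", "Kind": "Book", "Title": "Azure Primer", "Pages": 120},
    {"Id": "b2", "Kind": "Book", "Title": "Cloud Basics"},           # no Pages
    {"Id": "m1", "Kind": "Movie", "Title": "Clouds", "Minutes": 90},
]

def where(items, predicate):
    """Filter entities, much like a LINQ 'where' clause."""
    return [e for e in items if predicate(e)]

# The predicate probes properties directly; missing ones get a default.
long_books = where(entities, lambda e: e.get("Kind") == "Book"
                                       and e.get("Pages", 0) > 100)
print([e["Title"] for e in long_books])  # ['Azure Primer']
```

Note how the three entities carry different properties yet live in the same store, which is exactly what a fixed relational schema would forbid.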
SQL Data Services can provide better scalability, availability, and reliability than is possible by just running a relational DBMS in the cloud. The way it organizes and retrieves data makes replication and load balancing easier and faster than with a traditional relational approach. Another advantage is that SQL Data Services doesn’t require customers to manage their own DBMS. Rather than worry about the mechanics, such as monitoring disk usage, servicing log files, and determining how many instances are required, a SQL Data Services customer can focus on what’s important: the data.
And finally, Microsoft has announced plans to add more relational features to SQL Data Services. Expect its functionality to grow. SQL Data Services can be used in a variety of ways. Here are some examples: An application might archive older data to SQL Data Services. For instance, think of an application that provides frequently updated RSS feeds. Information in these feeds that’s more than, say, 30 days old probably isn’t accessed often, but it still needs to be available. Moving this data to SQL Data Services could provide a low-cost, reliable alternative.
Suppose a manufacturer wishes to make product information available to both its dealer network and directly to customers. Putting this data in SQL Data Services would allow it to be accessed by applications running at the dealers and by a customer-facing Web application run by the manufacturer itself. Because the data can be accessed through RESTful and SOAP interfaces, the applications that use it can be written using any technology and run on any platform. Like other components of the Azure Services Platform, SQL Data Services makes it simple to use its services: Just go to a Web portal and provide the necessary information.
Whether it’s for archiving data cheaply, making data accessible to applications in diverse locations, or other reasons, a cloud database can be an attractive idea. As new technologies become available under the SQL Services umbrella, organizations will have the option to use the cloud for more and more data-oriented tasks. Live Services While the idea of cloud platforms is relatively new, the Internet is not. Hundreds of millions of people around the world use it every day. To help them do this, Microsoft provides an expanding group of Internet applications, including the Windows Live family and others.
These applications let people send instant messages, store their contact information, search, get directions, and do other useful things. All of these applications store data. Some of that data, such as contacts, varies with each user. Other data, like mapping and search information, doesn’t: we all use the same underlying information. In either case, why not make this data available to other applications? While controls are required, since freely exposing everyone’s personal information isn’t a good idea, letting applications use this information can make sense.
To allow this, Microsoft has wrapped this diverse set of resources into a group of Live Services. Existing Microsoft applications, such as the Windows Live family, rely on Live Services to store and manage their information. To let new applications access this information, Microsoft provides the Live Framework. Figure 5 illustrates some of the Framework’s most important aspects. Figure 5: The Live Framework lets applications access Live Services data, optionally synchronizing that data across desktops and devices
To set up and manage the Live Services her application needs, a developer can use the browser-based Live Services Developer Portal. Figure 5 shows another aspect of the Live Framework: the Live Operating Environment can also live on desktop systems running Windows Vista, Windows XP, or Macintosh OS X, and on Windows Mobile 6 devices. To use this option, a user groups his systems into what’s known as a mesh. For example, you might create a mesh that contains your desktop computer, your laptop, and your mobile phone.
Each of these systems runs an instance of the Live Operating Environment. A fundamental characteristic of every mesh is that the Live Operating Environment can synchronize data across all of the mesh’s systems. Users and applications can indicate what types of data should be kept in sync, and the Live Operating Environment will automatically update all desktops, laptops, and devices in the mesh with changes made to that data on any of them. And since the cloud is part of every user’s mesh—it acts like a special device—this includes Live Services data.
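The synchronization behavior described above can be modeled as a toy: a change made on any device propagates to every replica, with the cloud treated as just another replica. The real Live Operating Environment synchronizes via feeds; this sketch conveys only the idea.

```python
# Toy model of mesh synchronization: every device (and the cloud) holds a
# replica, and an update anywhere is pushed to all replicas.

class Mesh:
    def __init__(self, devices):
        # one replica of the synced data per device; "cloud" is just
        # another member of the mesh
        self.replicas = {name: {} for name in devices}

    def update(self, device, key, value):
        """Apply a change on one device, then sync it to every replica."""
        self.replicas[device][key] = value
        for data in self.replicas.values():
            data[key] = value  # propagate (harmlessly re-applies to origin)

mesh = Mesh(["desktop", "laptop", "phone", "cloud"])
mesh.update("phone", "contact:alice", "alice@example.com")
print(mesh.replicas["desktop"]["contact:alice"])  # alice@example.com
```

A real implementation must also handle conflicts and offline devices, which is why the Live Operating Environment runs on each machine rather than only in the cloud.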
For example, if a user has entries maintained in the contacts database used by Windows Live Hotmail, Windows Live Messenger, Windows Live Contacts, and other applications, this data is automatically kept in sync on every device in his mesh. (This ability isn’t yet supported in the Live Framework November 2008 CTP, however.) The Live Operating Environment also allows a user to expose data from his mesh to other users, letting him selectively share this information. As Figure 5 shows, an application can access mesh data through either the local instance of the Live Operating Environment or the cloud instance.
In both cases, access is accomplished in the same way: through HTTP requests. This symmetry lets an application work identically whether it’s connected to the cloud or not—the same data is available, and it’s accessed in the same way. Any application, whether it’s running on Windows or some other operating system, can access Live Services data in the cloud via the Live Operating Environment. If the application is running on a system that’s part of a mesh, it also has the option of using the Live Operating Environment to access a local copy of that Live Services data, as just described.
There’s also a third possibility, however: A developer can create what’s called a mesh-enabled Web application. This style of application is built using a multi-platform technology such as Microsoft Silverlight, and it accesses data through the Live Operating Environment. Because of these restrictions, a mesh-enabled application can potentially execute on any machine in a user’s mesh—a Windows machine, a Macintosh, or a Windows Mobile device—and it always has access to the same (synchronized) data.
To help users find these applications, the Live Framework environment provides a cloud-based application catalog for mesh-enabled Web applications. A user can browse this catalog, choose an application, then install it. And to help their creators build a business from their work, Microsoft plans to provide built-in support for displaying advertising in these applications. The Live Framework offers a diverse set of functions that can be used in a variety of different ways. Here are a few examples: A Java application running on Linux could rely on the Live Framework to access a user’s contacts information.
The application is unaware that the technology used to expose this information is the Live Framework; all it sees is a consistent HTTP interface to the user’s data. A .NET Framework application might require its user to create a mesh, then use the Live Framework as a data caching and synchronization service. When the machine this application runs on is connected to the Internet, the application accesses a copy of its data in the cloud. When the machine is disconnected (maybe it’s running on a laptop that’s being used on an airplane), the application accesses a local copy of the same data.
Changes made to any copy of the data are propagated by the Live Operating Environment. An ISV can create a mesh-enabled Web application that lets people keep track of what their friends are doing. This application, which can run unchanged on all of its user’s systems, exploits several aspects of the Live Framework that support social applications. Because the Live Framework can expose information in a user’s mesh as a feed, for example, the application can track updates from any of the user’s friends.
Because the Live Framework provides a delivery mechanism for mesh-enabled Web apps, viral distribution is possible, with each user inviting friends to use the application. And because the mesh automatically includes a user’s Live Services contacts, the user can ask the application to invite friends by name, letting the application contact them directly. The Live Framework provides a straightforward way to access Live Services data (and don’t be misled by the simple contacts example used here—there’s lots more in Live Services). Its data synchronization functions can also be applied in a variety of applications.
For applications that need what it provides, this platform offers a unique set of supporting functions. Conclusion Cloud computing builds on decades of research in virtualization, distributed computing, utility computing, and, more recently, networking, web and software services. It implies a service-oriented architecture, reduced information technology overhead for the end-user, great flexibility, reduced total cost of ownership, on-demand services and many other things. In today’s global competitive market, companies must innovate and get the most from their resources to succeed.
Cloud computing infrastructures are next generation platforms that can provide tremendous value to companies of any size. They can help companies achieve more efficient use of their IT hardware and software investments and provide a means to accelerate the adoption of innovations. Cloud computing increases profitability by improving resource utilization. Costs are driven down by delivering appropriate resources only for the time those resources are needed. Cloud computing has enabled teams and organizations to streamline lengthy procurement processes.
Cloud computing enables innovation by alleviating the need of innovators to find resources to develop, test, and make their innovations available to the user community. Innovators are free to focus on the innovation rather than the logistics of finding and managing the resources that enable it.

Bibliography:
http://www.wikipedia.org/
http://www.microsoft.com/azure
http://msdn.microsoft.com/azure
http://microsoftpdc.com
http://channel9.msdn.com/pdc2008
http://blogs.msdn.com/windowsazure
http://blogs.msdn.com/ssds
http://blogs.msdn.com/netservice