Monday, November 30, 2009

PS3 gets Gaming Integration with Facebook

Facebook will be integrated into an upcoming PlayStation 3 firmware update.
Sony has confirmed earlier reports that the social networking website will arrive on the console in the near future.
Firmware 3.10 is to allow PSN accounts to be linked to the service, updating trophy and purchase information to news feeds.
The integration will also let developers publish additional information related to the game and set specific criteria such as the regularity of updates.
The upgrade will further streamline photo viewing functionality and modify the look of the PSN friends listing.
The announcement came as Facebook and Twitter support was released on the Xbox 360.

Security App Uses 'Keystroke Biometrics'

Boston-based Delfigo Security has introduced information systems user authentication software that uses "keystroke biometrics" among other factors for verifying user identity.
Keystroke biometrics capture a user's typing patterns, such as how fast a user types and how long keys are pressed. The DSGateway product is being marketed to financial institutions, health care organizations and e-commerce firms. The cardiology department of Children's Hospital Boston is the vendor's first health care customer.
In addition to usernames and passwords, the product captures 14 other identification profiles, including typing patterns and system and browser identifiers. The vendor also offers third-party single sign-on software with the authentication product.
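As a rough illustration (not Delfigo's actual algorithm), the sketch below shows the kind of timing features keystroke biometrics typically rely on: dwell time (how long a key is held down) and flight time (the gap between releasing one key and pressing the next), compared against an enrolled profile. The names, data, and tolerance are all hypothetical.

```python
# Rough illustration (not Delfigo's actual algorithm) of the timing features
# keystroke biometrics typically use: dwell time (how long a key is held)
# and flight time (the gap between releasing one key and pressing the next).
from statistics import mean

# Each event is (key, press_time, release_time) in seconds, e.g. recorded
# while the user types a passphrase.
def extract_features(events):
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return [mean(dwell), mean(flight)]

def matches_profile(sample, profile, tolerance=0.05):
    """Accept the sample if its mean dwell/flight times are within
    `tolerance` seconds of the enrolled profile (a crude distance check)."""
    return all(abs(o - p) <= tolerance
               for o, p in zip(extract_features(sample), profile))

enrolled = [0.11, 0.18]   # hypothetical mean dwell / flight times from enrollment
attempt = [("p", 0.00, 0.10), ("a", 0.25, 0.37),
           ("s", 0.55, 0.66), ("s", 0.84, 0.95)]
print(matches_profile(attempt, enrolled))   # True if the typing rhythm matches
```

A real product would combine features like these with the other identifiers mentioned above to produce a confidence score rather than a simple yes/no match.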
Delfigo Security was founded in July 2008 and DSGateway is its first product. It is available in remotely hosted or client-hosted versions. More information is available at delfigosecurity.com.

Tuesday, November 17, 2009

The Fastest Computer In The World

A Cray XT5 supercomputer dubbed “Jaguar” has claimed the spot as the world’s most powerful computer, taking away the crown from an IBM supercomputer called Roadrunner.

Located at the Department of Energy’s Oak Ridge Leadership Computing Facility, Jaguar posted a 1.75 petaflop per second (petaflop/sec.) performance running the Linpack benchmark, according to the newest edition of the Top 500 supercomputer list, which was released at the SC09 supercomputing conference this week in Portland, Ore.

This was actually Jaguar’s third attempt to take the title away from IBM’s Roadrunner, which had held the world’s top spot for the previous 18 months. This time Jaguar roared ahead with new six-core Opteron processors from Advanced Micro Devices. Jaguar utilizes nearly a quarter of a million cores and has a theoretical peak capacity of 2.3 petaflop/sec.
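As a back-of-envelope check of that peak figure, the sketch below multiplies cores by clock rate by floating-point operations per cycle. The core count, 2.6 GHz clock, and 4 flops per cycle are approximate assumptions for that Opteron generation, not figures from the article.

```python
# Back-of-envelope check of the peak figure. Core count, clock speed and
# flops per cycle are approximate assumptions, not figures from the article.
cores = 224_256            # roughly a quarter of a million cores
clock_hz = 2.6e9           # assumed 2.6 GHz Opteron clock
flops_per_cycle = 4        # typical for that processor generation

peak = cores * clock_hz * flops_per_cycle
print("Theoretical peak: %.2f petaflop/sec" % (peak / 1e15))   # ~2.33
print("Linpack efficiency: %.0f%%" % (100 * 1.75e15 / peak))   # ~75%
```

Under those assumptions the 1.75 petaflop/sec. Linpack result works out to roughly 75 percent of theoretical peak, which is in the normal range for a machine of this class.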

IBM’s Roadrunner system, which is installed at Los Alamos National Laboratory in New Mexico, was the world’s first supercomputer to reach petaflop/sec. speeds. It claimed the top supercomputer crown in June 2008 and held the spot in the November 2008 and June 2009 Top 500 roundups. This time around, however, Roadrunner managed only a 1.04 petaflop/sec. performance, down from 1.105 petaflop/sec. in June 2009, a decline attributed to a repartitioning of the system.

Kraken, another upgraded Cray XT5 system, installed at the National Institute for Computational Sciences at the University of Tennessee, claimed the No. 3 position with a performance of 832 teraflop/sec. No. 4 was the IBM BlueGene/P supercomputer located at Forschungszentrum Juelich in Germany, at 825.5 teraflop/sec. Rounding out the top five was the new Tianhe-1, which means "River in the Sky," located at the National Supercomputer Center in Tianjin, China. Tianhe-1 is being used for research in petroleum exploration and simulation of large aircraft designs.

Overall, the world’s supercomputers are getting faster. The entry level of the latest Top 500 list moved up to 20 teraflop/sec. from 17.1 teraflop/sec. The last system on the latest list would have ranked No. 336 on the previous Top 500 list just six months ago.

The full list can be viewed here: http://www.top500.org/


IBM Mashes Information and Analytics to Support Information Accessibility

IBM announced a new version of IBM Mashup Center that simplifies the assembly and publishing of information across a workforce and across the Internet for consumers, constituents and customers. The product is designed to let business and IT work together to harvest information from across the enterprise, including content, documents and reports, and to integrate capabilities that help individuals interact with that information as well as with information and services across the Internet. The growing volume of information in organizations cannot easily be leveraged by the business unless it can be integrated for use across applications.

IBM Mashup Center uses the IBM iWidget framework, which provides a common technology wrapper around sources of information and capabilities so that they can be managed in a library, where they can be shared and reused for a wide range of needs. These iWidgets can be combined into composite widgets and assembled without programming or traditional application development cycles.

Reference: www.information-management.com

Business Intelligence vs. SOA

Business intelligence enlightens an organization about information, but that information has to be relative to the business process. Statistics (why things are happening), trends, forecasting / predictive modeling, and finally business process optimization all require an understanding of the data in context. Most BI solutions or data warehouses are information integration solutions that bring together disparate information sources into a single, well-documented solution (with metadata) and are often updated daily. A BI solution requires a managed information architecture / environment, which includes:

  • Basic Metadata – reporting and operational
  • Common Communications – semantic consistency, dictionary
  • Data Architecture – data models, taxonomies, ontology
  • Trusted Data (data quality) & Usage (data governance)
  • Information Delivery – reporting, and even data services

Not all BI solutions require or use a single integrated information source. An example is an organization that utilizes an ERP system (like SAP, PeopleSoft, etc.), although it may still wish to develop customized reports using a BI package.

Service-based architectures (including web services) require that we break the solution or application into small business chunks that support the processing of these services. Information about that processing becomes a natural output and is generated and collected in real time. Building a service-oriented architecture (SOA) also requires a managed information architecture environment, which similarly includes:

  • Metadata and Meta-process information (enhanced beyond reporting)
  • Common communications – processes (business rules, choreography, orchestration, event-correlation), messaging (mechanism – MOM, delivery)
  • Enterprise Architecture – component or flexible implementations, defined layers of abstraction
  • Data layer abstraction
  • Governance – architecture, security, SLAs, applications, portfolio, etc.

Next came a common integration point known as Master Data Management, something that required both disciplines. I think this was the impetus for the convergence of SOA and business intelligence: it required the real-time integration of multiple applications (services-based) as well as a trusted data foundation (information architecture). It was a given that both solutions were developed using the business architecture as the source of requirements, and that both required structured communications (metadata) and governance (of data at a minimum).
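A minimal, product-agnostic sketch of that dual requirement might look like the following: two stand-in service calls (ERP and CRM) feed a master-record merge governed by a simple survivorship rule. The function names and data are invented for illustration only.

```python
# Product-agnostic sketch: a master-data lookup that needs both SOA-style
# calls to multiple applications and a trusted-data rule for merging them.
# The functions below are stand-ins, not real service clients.

def fetch_from_erp(customer_id):
    return {"id": customer_id, "name": "ACME Corp",
            "address": "old address", "updated": "2009-01-10"}

def fetch_from_crm(customer_id):
    return {"id": customer_id, "name": "ACME Corporation",
            "address": "12 Main St", "updated": "2009-09-02"}

def master_record(customer_id):
    """Merge source records, letting the most recently updated value win
    for each attribute (a deliberately simple survivorship rule)."""
    records = sorted([fetch_from_erp(customer_id), fetch_from_crm(customer_id)],
                     key=lambda r: r["updated"])
    merged = {}
    for record in records:      # newer records overwrite older values
        merged.update(record)
    return merged

print(master_record("C-1001"))
```

The service calls are the SOA half of the problem; deciding which value survives, and documenting why, is the information-architecture half.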

The IT community has sensed the need for an overall architecture or method for integrating systems and data, and for consistency in the terms we use. Just as the builder, electrician and plumber can all communicate with one another, we should learn to do the same.

Monday, October 26, 2009

CIS 8020 Assignment 2 AO Google Static Maps

A company in California manufactures and sells laboratory instruments and employs over 40 sales and service personnel, each based in a different part of the country. Salespeople visit their local hospitals and laboratories, while service people stay on call and move from client site to site to perform maintenance and repairs on existing units. Headquarters in California must constantly be aware of the whereabouts of the sales and service personnel in the field, so that when an emergency arises, HQ can allocate the job to whichever field person happens to be nearest.


Using Google's Static Maps API, headquarters can plot the whereabouts of the people in the field as often as it deems necessary, mapping points of emergency against the locations of nearby field personnel.
For instance, the field service rep serving the Atlanta territory covers Georgia, northern Florida, Alabama, and Mississippi. When an emergency arises and the person serving North Carolina is not available, management can look at where the reps currently are and dispatch whoever is nearest at that moment.
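A minimal sketch of how HQ might build such a map is below. The endpoint and the center/zoom/size/markers URL parameters are assumptions that should be checked against Google's Static Maps documentation, and the coordinates and rep names are made up.

```python
# Hypothetical sketch: build a Static Maps URL showing an emergency site and
# the last reported positions of nearby field reps. The endpoint and the
# center/zoom/size/markers parameters are assumptions to verify against
# Google's Static Maps documentation; coordinates are invented.
from urllib.parse import urlencode

field_reps = {                       # rep -> (latitude, longitude)
    "Atlanta rep":   (33.749, -84.388),
    "Charlotte rep": (35.227, -80.843),
    "Nashville rep": (36.163, -86.781),
}
emergency_site = (35.772, -78.639)   # e.g. a client lab in Raleigh, NC

markers = ["color:red|label:E|%f,%f" % emergency_site]
markers += ["color:blue|%f,%f" % pos for pos in field_reps.values()]

base = "http://maps.googleapis.com/maps/api/staticmap?"
params = urlencode({"center": "%f,%f" % emergency_site, "zoom": 6,
                    "size": "640x400", "sensor": "false"})
url = base + params + "".join("&markers=" + m for m in markers)
print(url)   # embed this URL in an <img> tag on the HQ page

# Crude straight-line comparison to suggest who to dispatch.
nearest = min(field_reps, key=lambda rep: (field_reps[rep][0] - emergency_site[0]) ** 2 +
                                          (field_reps[rep][1] - emergency_site[1]) ** 2)
print("Nearest rep:", nearest)
```

Regenerating the URL whenever a rep reports a new location is all it takes to keep the map current, which matches the quick-update requirement described below.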


[Embedded map]

This API was chosen for its simplicity, and the company's needs are simple enough to be well served by it. Being easy to implement and use allows for quick updates, which is important since the field personnel's locations are constantly changing.

Sunday, October 25, 2009

CIS 8020 Assignment 2 - BM - Google Static maps

A map solution to keeping track of your offices.

A small business is quickly expanding and needs to provide potential clients with a map of its city or office locations. The offices are in growing areas, so roads, buildings, and other pieces of infrastructure change often. The directions on the website need to keep changing, and showing a picture of the building is becoming difficult with the frequently changing surroundings. They decide to use a static map from Google, which is easy to integrate and is maintained by Google.

Here is the map that would be shown for their Berkeley office:


A light example of how you could select between offices is here:
http://coldmails.com/assignment2/Default.aspx

This shows how easy it would be to switch between maps in 4 different locations.
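A rough sketch of the selection idea follows, with invented coordinates and the same caveat that the exact Static Maps endpoint and URL parameters should be checked against Google's documentation.

```python
# Hypothetical sketch of the office switcher: map each office name to a
# center point and build a Static Maps URL on demand. Coordinates are
# invented and the URL parameters should be checked against Google's docs.
from urllib.parse import urlencode

offices = {
    "Berkeley": "37.8716,-122.2727",
    "Atlanta":  "33.7490,-84.3880",
    "Austin":   "30.2672,-97.7431",
    "Boston":   "42.3601,-71.0589",
}

def office_map_url(office, zoom=15, size="500x350"):
    params = urlencode({"center": offices[office], "zoom": zoom,
                        "size": size, "sensor": "false"})
    return "http://maps.googleapis.com/maps/api/staticmap?" + params

print(office_map_url("Berkeley"))   # swap the office name to switch maps
```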

The largest advantage of this solution is that maintenance by Google alleviates the worry companies would have if they tried to maintain their own maps. The solution is easy to implement and updates automatically. It is also fast, and the user experience is not affected at all, since you can adjust the image size to your liking in the API requests.


 

CIS8020 Assignment 2 PC Gold Vs Crude prices API Tracker

Scenario:



An affluent client has asked an investment management firm for a comparison between gold and crude oil prices. The firm decides to provide static charts of average values for the latest three quarters and refrains from real-time updates, as it uses third-party data.



Solution:



The Google Spreadsheets & Visualization API is a good way to execute the query and can easily be incorporated into any website. The website has to collect the input data (crude oil and gold prices) and display it in the requisite format. The prototype is as follows:




The source data for this prototype can be viewed here:



http://spreadsheets.google.com/pub?key=tX572Gre0yQkL0LGzZZpehg&single=true&gid=0&output=html
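A rough sketch of the aggregation behind such a chart is shown below. The price rows are invented placeholders; in the prototype, the figures would instead come from the published spreadsheet linked above.

```python
# Rough sketch of the aggregation behind the chart: average gold and crude
# prices per quarter. The rows below are invented placeholders; in the
# prototype the figures would come from the published spreadsheet above.
from collections import defaultdict
from statistics import mean

rows = [  # (date, gold USD/oz, crude USD/bbl), placeholder values
    ("2009-01-15", 855.0, 41.7), ("2009-02-16", 940.0, 37.5),
    ("2009-04-15", 890.0, 49.3), ("2009-05-15", 928.0, 58.6),
    ("2009-07-15", 939.0, 61.5), ("2009-08-14", 948.0, 67.5),
]

by_quarter = defaultdict(list)
for date, gold, crude in rows:
    quarter = f"{date[:4]} Q{(int(date[5:7]) - 1) // 3 + 1}"
    by_quarter[quarter].append((gold, crude))

for quarter, prices in sorted(by_quarter.items()):
    print(quarter,
          "gold avg %.1f" % mean(g for g, _ in prices),
          "crude avg %.1f" % mean(c for _, c in prices))
```

The quarterly averages are what the static chart would plot; the Visualization API simply takes care of rendering them.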



Such an API saves programming time and cost and is easy to maintain. It not only allows the website to customize data for specific client needs but also leaves scope for real-time update capabilities, if required. The entire model can easily be incorporated into the website and offers better service quality.



Disclaimer: Past performance is not indicative of future performance. The data is taken from third-party websites and does not represent any research, ideas or advice from the publisher.

Friday, October 9, 2009

Cool Mash-up websites

Guys, take a look at this video; it demonstrates some really cool mashup websites.
It also shows the Google Maps mash-up which Max showed us during his presentation on Tuesday.
My favourites are Map My Buddies (a mashup of Facebook and the Google Maps API) and ProgrammableWeb, which gives a whole list of mashups.

Can Mashups replace ERP? Enterprise Mashups

So! Everyone's talking about mashups? It was also a hot topic in this week's class discussion!!
But defining a mashup in the context of the enterprise is another story.
Let's take an example of a sophisticated enterprise mashup: connecting your SAP ERP data with your Oracle/Siebel CRM data and two sets of online third-party demographic information while maintaining single sign-on through your global LDAP server, then sharing the mashup with your management. And doing it without IT's involvement.

It is reasonable to expect that enterprise mashups will be both a tool for project managers and part of the new class of 'Web 2.0' applications they will be involved in implementing. As a tool, enterprise mashups can greatly improve real-time decision-making capabilities.
Mashup adoption in industries like financial services and government has already begun; some integrators are already learning what it means to deal with this new style of 'Web 2.0 mashup application', with requirements like 'loosely coupled', 'user-driven', and 'browser-based'.

Enterprise mashup solutions give users the ability to mash data from a data warehouse or mart as easily as from any other data source. Mashups can be constructed from transactional ERP/CRM/SFA systems as well as newer interface technologies like SOA and RSS services, making these disparate sources easy to combine dynamically into real-time information solutions.
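As a toy illustration of that pattern (not any particular product), the sketch below joins records from an internal system with headlines pulled from an RSS feed. The account data and feed URL are placeholders.

```python
# Toy illustration of the pattern above: attach headlines from an RSS feed
# to accounts pulled from an internal system. The account data and feed URL
# are placeholders, not a real integration.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

def fetch_rss_titles(feed_url):
    """Return item titles from a standard RSS 2.0 feed."""
    with urlopen(feed_url) as response:
        root = ET.parse(response).getroot()
    return [item.findtext("title", "") for item in root.iter("item")]

accounts = [{"name": "ACME Corp", "region": "Southeast"},   # pretend CRM extract
            {"name": "Globex", "region": "Midwest"}]

def mash(feed_url):
    """Attach any headline mentioning the account name to that account."""
    headlines = fetch_rss_titles(feed_url)
    return [dict(acct, news=[h for h in headlines if acct["name"] in h])
            for acct in accounts]

# print(mash("http://example.com/industry-news.rss"))  # placeholder feed URL
```

The point is how little glue is needed once the sources expose themselves as feeds or services; the hard part, as discussed above, is doing this at enterprise scale with security and governance.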

Tuesday, September 29, 2009

Data Portability: A New Answer to Integration for a Web 2.0 World

A broad variety of Internet-based applications such as Flickr and Facebook is now available. They allow users to maintain profiles, participate in communities, create and post content, and perform a wide variety of other tasks. Some applications even mash up business logic and services from different sources. Tim O’Reilly, a well-known Silicon Valley investor and blogger, has defined this new phenomenon as follows:
Web 2.0 is the business revolution in the computer industry caused by the move to the internet as platform, and an attempt to understand the rules for success on that new platform. Chief among those rules is this: Build applications that harness network effects to get better the more people use them. (This is what I've elsewhere called "harnessing collective intelligence.")
A basic level of standardization is often the first step and is an important requirement for more powerful data integration services to emerge. In other words, data portability is the precursor to data integration. Today, new organizations and committees aimed at standardizing representation of data on the Internet are already beginning to make data portability a reality.
New Standards Emerge
Dataportability.org is one of the more interesting groups driving data portability today. Originally, this organization focused on putting users in control of their own data, as it accumulated across various Internet-based services. However, the organization now has a broader goal, and champions overall standards for Web data portability. Let’s take a quick look at some of them.
OpenID. OpenID aims to be a free and simple way to use a single digital identity across the Web, and has been adopted by Yahoo, AOL, Google, Microsoft, MySpace, Orange, France Telecom and many other providers. More than half a billion people now have an OpenID identifier, which should further encourage active adoption of this standard.
OAuth. OAuth is defined as a simple way to publish and interact with protected data on the Internet.
RSS. RSS stands for Really Simple Syndication. It’s a dialect of XML that is often used to share content on the Internet, signified by the orange RSS icon you’ll spot on numerous sites.
Other standards include CMIS for content integration and SSE for file sharing.
OPML. OPML stands for Outline Processor Markup Language. It’s closely related to RSS, and is often used to share RSS subscription lists amongst readers and aggregators, which typically allow users to export and import RSS subscriptions and Web bookmarks as an OPML file.
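A minimal sketch of what an aggregator does with such a file is shown below, relying on the common convention that each outline element carries an xmlUrl attribute pointing at the feed.

```python
# Minimal sketch of reading an OPML subscription list: pull the feed URL
# (conventionally stored in the xmlUrl attribute) from each outline element.
import xml.etree.ElementTree as ET

opml = """<opml version="1.0">
  <body>
    <outline text="Example blog" type="rss" xmlUrl="http://example.com/feed.rss"/>
    <outline text="Another feed" type="rss" xmlUrl="http://example.org/rss.xml"/>
  </body>
</opml>"""

root = ET.fromstring(opml)
feeds = [o.get("xmlUrl") for o in root.iter("outline") if o.get("xmlUrl")]
print(feeds)   # feed URLs ready to hand to an aggregator
```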
A Data OASIS
Dataportability.org is just one of many organizations promoting standardization of data on the Internet. OASIS (Organization for the Advancement of Structured Information Standards) is another prominent player and has sponsored many other standards, most notably CMIS or Content Management Interoperability Services.

As content and services providers continue to embrace the Web as their platform, the same data portability and integration problems that emerged in the enterprise will appear again. New Internet-based tools and services that can connect disparate data islands and provide transparent and secure access to Internet data regardless of where it is stored will be essential to making Web content and services truly accessible, and standardizing data-related formats and protocols will be a critical first step.

Interesting new standards such as OpenID, OAuth, CMIS and others have already begun to emerge. These standards are a huge step forward, but much more must be done to improve these standards and ensure they are supported by new and existing Internet-based services. Until this is accomplished, data integration in the Internet will remain a big issue. However, a big problem is usually a big opportunity in disguise, and many companies will undoubtedly rise to the challenge and fill this important need.

Tuesday, September 15, 2009

Integration at Home - SOA

Overview on Home Integration

It all started when Microsoft came out with Universal Plug and Play (UPnP) capabilities to allow users to dynamically plug in new devices such as mice, keyboards, external storage, etc. As broadband started being widely adopted and people started having more than one PC at home, network/wireless hubs also became popular within households. This required device manufacturers to adopt a standard for dynamic discovery, control, event notification, presentation, etc. As PCs and all their peripherals already supported UPnP, it became the choice by default. UPnP is an architecture for peer-to-peer networking of PCs, networked appliances and wireless devices. It is a distributed, open architecture based on TCP/IP, UDP and HTTP.

Next came the gaming consoles with internet connections, and as the network hubs (both wired and wireless) were the gateways to the internet, these devices also started supporting UPnP. After the first major release of the gaming consoles, subsequent releases started adding additional multimedia capabilities. These same capabilities were also available on PCs (and Macs), and household storage requirements increased exponentially to hold all the pictures, music, videos, personal documents, etc. Today these storage devices also come with multimedia and Web servers to present the content on demand to the user. Once again, device manufacturers leveraged UPnP not only to discover devices and services but also to control access to remote drives and files and to control audio and video playback.

At the same time, mobile phones had implemented Bluetooth, a low-power, short-range wireless personal area network technology for exchanging information between devices such as mobile phones, laptops, PCs, digital cameras and video game consoles. As the range is short, most Bluetooth devices act as clients sending commands to a server such as a PC or game console. These servers would, in turn, leverage UPnP if they needed to access or control remote content.

Until recently, home automation technology was very expensive and not affordable for most households. With the introduction of low-cost ZigBee and Wibree chips, the home automation industry has taken off. Home automation is no longer just for environmental control such as lighting and temperature; it has also expanded to control audio and video devices, all based on IP networks. Basically, the chip technology allows for control from the server to a client such as a light switch, garage door or temperature controller. The servers leverage UPnP to control access to remote services, those not directly connected to the server the user is interacting with.

Basically, UPnP is the underlying protocol that provides peer-to-peer networking capabilities within the household. However, due to some limitations of this protocol, Microsoft along with some of its partners proposed a new standard, the Devices Profile for Web Services (DPWS), to replace UPnP. As this is not backward compatible with UPnP, all vendors are expected to support both UPnP and DPWS until there is convergence.

Independent of the protocol, home integration is now real, unifying PCs, gaming consoles and home automation under a console on any device.

Peer-to-Peer Network Overview

The concept of the P2P network is pretty straightforward. Whenever a peer joins a network, it multicasts its availability, which triggers interaction between it and the other peers to acknowledge its existence and maintain the peer map. In addition, peers may exchange device descriptions and control information, such as the commands available. Once the peer network is established, a client on any of these devices can control any remote device or resource. All these remote devices/resources can also be referred to as services available on the peer network.
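A minimal sketch of that discovery step is below, using the standard SSDP M-SEARCH multicast that UPnP devices listen for. It is only a probe of the first phase (finding devices), not a full UPnP control point; run it on a home LAN, and note that an empty network or a firewall will simply produce no replies.

```python
# Sketch of UPnP's discovery step: multicast an SSDP M-SEARCH and print the
# devices that answer. Run it on a home LAN; an empty network or a firewall
# will simply produce no replies.
import socket

MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",            # seconds a device may wait before replying
    "ST: ssdp:all",     # search target: every device and service
    "", ""])

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH.encode("ascii"), ("239.255.255.250", 1900))

try:
    while True:
        data, addr = sock.recvfrom(65507)
        print(addr[0], "->", data.decode(errors="replace").splitlines()[0])
except socket.timeout:
    pass                # no more responses within the timeout
finally:
    sock.close()
```

Each response points to a device description document, which a control point would then fetch to learn what services and commands the device exposes.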

One of the major limitations today is that these protocols used at home only support access from within the local network, i.e., the user cannot access these services from outside the house. The vendors are now actively working on expanding the standard to support remote secure access to these personal home integration services.

Relevance to the Enterprise

With all the major enterprise software vendors backing the Service Component Architecture (SCA), there is now a single standard for describing services and their dependencies. SCA defines only the metadata that describes a service and its dependencies, and most vendors initially leverage this in their design/development tools to enable architects to model the service assembly. However, this does not yet address the IT operations administration nightmare of reconfiguring the impacted services whenever the deployment model changes.

Vendors are expected to address this administration overhead in subsequent releases by developing or incorporating an SCA container. In this approach, development teams focus on developing services and invoking external services with the right interfaces, without focusing on the bindings. In addition, IT operations no longer need to reconfigure all the services: the SCA container dynamically discovers all the relevant services using an approach similar to the one used for home integration. The UDDI standard is not sufficient to enable dynamic discovery, and some of the current open source SCA container projects have started leveraging peer-to-peer technologies such as Jini and JXTA. This dynamic discovery will make it simpler and easier for IT operations to manage all the service dependencies and will also reduce the need for an enterprise service bus.

Similar to the social networking sites that drove the innovation of Web 2.0 solutions in the consumer space and are now impacting the enterprise, the integration-at-home approach should be adopted by IT operations to manage and monitor all service dependencies in production. As DPWS leverages WS-Discovery for dynamic discovery of services at home, it would make sense for ISVs to simplify the convergence between personal and work services for the user.
Reference: www.information-management.com

Tuesday, September 1, 2009

Presentation for week three: System Integration Models

Here is our first presentation, where we talked about the general topics and styles of system integration. We also talked about a lot of topics that will be discussed later by other teams. If you are on one of those teams and we stole your thunder, we would just like to say: "Our bad, beer is on us."

The primary topics discussed are the main Integration Models, and the styles in which you can implement them. They are:

1. Presentation Integration
     - Very simple and straightforward.

2. Data Integration
     - Shared Database
     - Maintain Data Copies
     - File Transfer

3. Functional Integration
     - Distributed Object Integration
     - Message-Oriented Middleware Integration
     - Service-Oriented Integration

We had a lot of great discussion and interesting talking points. Thanks, everyone!

Here is a link to the presentation, hosting courtesy of our professor: http://gzheng.cis.gsu.edu/teaching/cis8020/files/sysintteam03.ppt

Tuesday, August 25, 2009

Test Blog 1

This is the blog of 

Ben Matthews

Andrew Oh

Prachi Rathore