Checking the sticker on a hybrid vehicle.

When I first heard this term, I wondered what was meant by hybrid integration. It seems to me that you either integrate or you don't.

Perhaps it is an attempt to merge data and application integration with social media, business-to-business, mobile, API and Cloud integration technologies. Just sounds like integration to me. In a previous post I discussed how Cloud is Erasing the Line Between App and Data Integration. Perhaps Hybrid is what they are calling this new world of integration, which is still just integration.

Hybrid integration does make some sense if you compare it to hybrid vehicles. A hybrid vehicle can run primarily on electricity, with a small gas-powered generator to keep the batteries charged and provide additional power when needed. So two different power sources were put into one car. Perhaps hybrid integration represents putting many integration “engines” together to make the best use of multiple integration “power” types depending on the requirements.

Indeed, we have seen some integration software vendors combine their data and application integration capabilities on the same platform, or provide the ability to move their integration “engine” across platforms. Informatica has done this with their Vibe data integration engine: write once, run anywhere integration. Other vendors, such as Actian, allow integration logic created on-premise to be easily ported to their DataCloud platform. MuleSoft offers a similar capability between their on-premise and Cloud offerings.

In other words, hybrid integration is saying you don’t need on-premise OR Cloud integration, data OR application integration: you can have both, wherever you want or need to deploy them. Or maybe it’s a foreshadowing of what many of us have been saying: in the future there won’t be on-premise or Cloud computing, it will just be computing. Cloud is simply another layer that organizations will have in the architecture.

Integration software vendors have also been notorious for having too many products, each with its own purpose and price tag. It’s not unlike going to the dealership only to discover the base model doesn’t have all the options you want, and optional equipment is added to the vehicle to meet your needs. Either way, the price goes up, and the purchasing process becomes complicated.

Fortunately, some integration software vendors are starting to change this scenario. Last week Informatica released version 9.6 of their data integration and data quality platforms. In addition to new capabilities, they have also simplified their licensing. Many of the stand-alone products that used to inflate the price and complicate implementation have been rolled into the core offerings. Yes, there are still multiple editions, but they are now packaged according to customer maturity and requirements. Yes, there are still “add-ons” for additional capabilities, but now there are only a few where there used to be many. Hopefully other vendors will follow suit and simplify their licensing.

So next time you go shopping for “hybrids”, take a closer look at the window sticker to see if options have inflated the price tag. And then take a closer look at the fuel economy ratings to make sure it is worth the haggling that may be required to get the deal you want!



Attending the Informatica analyst conference last week, my POV that the lines between application and data integration are blurring, if not disappearing, was reinforced. Informatica described an architectural hub-and-spoke data integration pattern that one of their customers implemented using PowerCenter, coupled with persistence and other complementary technology. Informatica is now productizing that pattern in their Data Integration Hub (DIH) solution.

The classic Application Integration (AI) functionalities of publish/subscribe, canonical message models, routing, brokering, and orchestration are being implemented in the Data Integration (DI) world, blurring the lines between the two integration domains. Data formatting, transformation, and enrichment are features that both domains have shared, because at the heart of every application programming interface (API) call is data. More recently, Change Data Capture (CDC) has brought real-time data messaging to the DI world.

The primary functional difference between AI and DI is the interface layer: AI interfaces at the API level, whereas DI interfaces at the database level. The primary non-functional difference is the way that data volume is realized. An easy illustration is to consider 1,000 records of data being sent between applications. An AI scenario would represent those records as individual messages, sent 1,000 times. In a DI scenario, those 1,000 records would be sent in one message. Therefore, DI scenarios are not ideal when implemented in application integration Enterprise Service Buses (ESBs), simply because ESBs are not engineered to process large data sets in each interaction.
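The granularity difference can be sketched in a few lines. This is a toy illustration, not any vendor's API: `send()` and the record layout are hypothetical stand-ins for a transport call (queue publish, bulk load, and so on).

```python
# The same 1000 records, sent two ways.
records = [{"id": i, "amount": i * 1.5} for i in range(1000)]

sent_messages = []

def send(payload):
    """Stand-in for one interaction with the transport (broker, ESB, bulk loader)."""
    sent_messages.append(payload)

# Application integration style: one message per record.
for record in records:
    send([record])                      # 1000 interactions through the broker/ESB
ai_message_count = len(sent_messages)   # 1000

sent_messages.clear()

# Data integration style: one bulk message carrying the whole set.
send(records)                           # 1 interaction moving the full batch
di_message_count = len(sent_messages)   # 1
```

The ESB in the first style pays its per-message overhead 1,000 times, which is exactly why bulk DI workloads fit poorly there.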

Q: So if both approaches have their place, why are the lines blurring?

A: The technology they are being implemented in.

If a data integration product can’t call REST and SOAP APIs, or an application integration product can’t interact directly with a database, neither vendor will get far in today’s IT landscape. Some of the larger vendors handle both approaches, but with different product sets, meaning customers need to spend more on software licensing to handle the two different scenarios. Other vendors focus on one or the other, with some overlap in each so they can claim to be all things to all people.

Wouldn’t it be great if data transformations written in an ESB were just as applicable in an ETL job, or vice versa? Wouldn’t it be great if integration specialists didn’t need to know and support multiple product sets that do similar, but different, things? Wouldn’t it also be great if we could reduce the number of software licenses that need to be negotiated, purchased, and maintained?

The obvious part of the DIH is to tackle the integration hairball. Even with tooling, data integration has long been point-to-point: Extract, Transform, Load implies one source and one target. The DIH lets multiple integration flows re-use canonical data in a publish/subscribe paradigm, removing the point-to-point nature of traditional data integration. Now one extract can serve multiple loads, because different transforms can be applied depending on the target.
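The one-extract-to-many-loads idea can be sketched as a minimal publish/subscribe hub. The names here (`Hub`, `subscribe`, `publish`) are illustrative only, and do not reflect Informatica's DIH product API.

```python
# Sketch of the publish/subscribe pattern behind a data integration hub:
# one extract is published once, and each subscriber applies its own transform.
class Hub:
    def __init__(self):
        self.subscribers = []  # (target_name, transform) pairs

    def subscribe(self, name, transform):
        self.subscribers.append((name, transform))

    def publish(self, canonical_rows):
        """Fan one canonical data set out to every subscribed target."""
        return {name: transform(canonical_rows)
                for name, transform in self.subscribers}

hub = Hub()
# Two targets reuse the same extract, each with its own shape.
hub.subscribe("warehouse", lambda rows: [(r["id"], r["amt"]) for r in rows])
hub.subscribe("crm", lambda rows: [{"customer": r["id"]} for r in rows])

extract = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 20.0}]
deliveries = hub.publish(extract)   # one extract, multiple loads
```

Adding a new target is a new `subscribe` call, not a new point-to-point extract, which is where the hairball untangles.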

My only question is: how long will it take for the Data Integration Hub to evolve into a Data Services Bus (DSB) that runs alongside, in, or below the Enterprise Service Bus? Hub-and-spoke integration went the way of the dinosaurs when it became the single point of failure in a distributed environment.


There has been a lot of buzz about a new concept emerging in the network community: software-defined networking (SDN). SDN is glamorized as networking’s latest push toward a more streamlined and cost-efficient alternative to the physical infrastructure currently dominating the floors of IT departments. Promoters are trumpeting this advancement as an innovation marvel, much like virtualization was for servers. In fact, a key component of SDN is bringing networks into a virtual environment. Despite the hype giving SDN much notability, many are still confused about its underlying concept, the possible complications, and the business value of an SDN network. Visit Info-Tech’s solution set Prepare for Software Defined Networking (SDN) to separate fact from fiction.

SDN is essentially a network architecture in which the management, mapping, and control of traffic flow are removed from individual network devices and centralized in a software controller. This separation is said to increase performance, network visibility, and simplicity, provided it is constructed correctly. However, given SDN’s infancy, a sufficient number of use cases and proofs-of-concept have yet to emerge in the SDN space, leaving organizations wondering whether there are any revenue-generating or cost-saving opportunities. How can they make a sound decision on SDN? It may be too early for a final decision, but they can start crafting the case and investigating the early movers in the SDN space.
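The separation of control and forwarding can be sketched in a few lines. This is a toy model only; the class and method names (`Controller`, `Switch`, `install_rule`) are hypothetical and do not correspond to OpenFlow or any real SDN controller API.

```python
# Toy sketch of the SDN split: forwarding decisions are made in one
# centralized controller, and switches only apply the rules pushed to them.
class Switch:
    def __init__(self, name):
        self.name = name
        self.flow_table = {}   # dest -> out_port, populated by the controller

    def forward(self, dest):
        # Data plane: a simple table lookup, no local decision-making.
        return self.flow_table.get(dest, "drop")

class Controller:
    """Centralized control plane: computes and installs flow rules."""
    def __init__(self, switches):
        self.switches = switches

    def install_rule(self, switch_name, dest, out_port):
        self.switches[switch_name].flow_table[dest] = out_port

switches = {"s1": Switch("s1"), "s2": Switch("s2")}
ctrl = Controller(switches)
ctrl.install_rule("s1", "10.0.0.2", "port2")   # policy decided centrally

decision = switches["s1"].forward("10.0.0.2")  # "port2"
unknown = switches["s2"].forward("10.0.0.9")   # no rule installed -> "drop"
```

The point of the sketch is the visibility argument: because all rules pass through one controller, the whole network's behavior can be inspected and changed in one place.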

Be prepared to see a shift in networking paradigms because of SDN: hardware to software, physical to virtual, proprietary to commodity. Naturally, this will throw traditional networking staff off their game. But do not worry: current SDN solutions are still in “Version 1,” and future versions may become friendlier to traditional network practices and concepts. With the attention it is getting from the media and established network leaders, SDN technologies will likely (and hopefully) evolve to mainstream deployment states.

Realize SDN is here. Understand where it came from and how it can help your business. Remember to wait for the SDN space to settle and mature before implementing SDN in your organization. After all, you wouldn’t want your child driving your multi-million dollar car.


Remember looking up in the sky as a kid? Clouds change shape, grow, shrink, and with your imagination they can look like different things. It doesn’t take long before they stop looking like one thing and start looking like another. Sometimes they can even disappear!

The same can be true in Cloud computing. Integrating Cloud data is like holding onto a kite string while the data is pushed and pulled by the wind. A Cloud DI strategy and plan can help you take control of your data movement to and from the cloud.

Data Integration as a Service (DIaaS) is enabling Cloud DI and taking the On-Premise DI vendors by storm. The costs of running a DIaaS solution are significantly lower than those of its On-Premise counterparts. On-Premise DI tooling still offers more functionality than DIaaS, but DIaaS vendors are adding advanced Cloud functionality that doesn’t exist in On-Premise solutions. DIaaS can do this simply because of where it lives.

Unfortunately, not all is sunny and bright. A dark performance cloud still hangs over SaaS and DIaaS. Data throughput to/from a cloud application via APIs can be up to 60 times slower than what we are used to On-Premise. This means moving 10,000 rows of data in a Cloud environment can take up to one hour, compared to one minute in an On-Premise environment.
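The arithmetic behind that claim is worth making explicit. The figures below are illustrative, taken from the 60x slowdown and 10,000-row example above, not measurements of any particular product.

```python
# Back-of-envelope check: a 60x API slowdown turns a one-minute
# on-premise transfer of 10,000 rows into roughly an hour in the cloud.
rows = 10_000
on_prem_seconds = 60                        # ~1 minute on-premise
slowdown = 60                               # cloud APIs up to 60x slower
cloud_seconds = on_prem_seconds * slowdown  # 3600 seconds = 1 hour

on_prem_rate = rows / on_prem_seconds       # ~167 rows/sec on-premise
cloud_rate = rows / cloud_seconds           # ~2.8 rows/sec via cloud APIs
```

At under three rows per second, batch windows that were comfortable On-Premise can blow out badly in the Cloud, which is why throughput testing belongs in any Cloud DI plan.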

Overall, organizations are becoming more comfortable with the Cloud and its benefits are becoming evident. Expect to see clearer skies in the future as dominant players emerge and interfaces become standardized for easier and faster DI.

For more information, see Implement a Data Integration Strategy in the Cloud.
