It’s Microsoft Worldwide Partner Conference 2014 – or should I say, Microsoft Cloud Worldwide Cloud Conference 2014…Cloud. The theme at this year’s WPC event in Washington, D.C. is clear: the cloud is the future of IT as we know it, and Microsoft will not be left behind. If anything, they’ll be out ahead, way ahead, so far ahead that they just might lose you along the way.

To be clear, there is nothing wrong with the vision Microsoft is putting out there. They paid the price for their inability to react to the tablet threat to Windows OS dominance in the desktop and laptop market, and are not about to let that happen again. What remains to be seen, though, is whether partners, and customers for that matter, will be able to absorb such sweeping change.

One of the most interesting revelations the cloud push has enabled is the extension of not just partner opportunities, but truly collaborative relationships with vendors you’d sooner consider competitors of Microsoft. Oracle, SAP, and Salesforce (announced last month) were all prominently featured as part of the integrated cloud ecosystem built upon Azure that extends functionality beyond just Microsoft’s stack, and into the realm of competitive/complementary solutions.

The cloud push extends beyond just the Azure offering (which according to Microsoft now has at least some footprint in 57% of the Fortune 500, and boasts more regional data center clusters than either AWS or Google, with 17) and right into the core of the partner network. Microsoft today announced numerous changes to cloud competencies, including waiving first-year fees for Silver partners, adding more Azure and Office 365 competencies, and making a commitment to provide significant support to their Cloud Solution Providers.

End-user customers are also going to feel the effects of the cloud wave. COO Kevin Turner made it clear: Microsoft understands that the productive consumer of technology is not hooked to one device in one place, and must be enabled to work wherever, however, and on whatever device they choose. The Office 365 approach follows that user-centric model, but there is still no mention of how companies that have elected to take a perpetual-license approach can benefit in the same way without exponential licensing costs. The answer is probably the same as the answer for the people still running Windows Server 2003 when it reaches end of life: just move it to the cloud already!

The highlight of the morning for me, though, was Turner’s big reveal of the fifth pillar to add to the big four: Mobility, Social, Big Data, and of course, the Cloud. In grandiose fashion, security was unveiled as the fifth piece of the puzzle… Now where have I seen that before? (You can tell it was 2012 because we were only talking about 10G Ethernet!)

Microsoft 2014
Info-Tech Research Group 2012

Until tomorrow…



Whether you work in information technology, marketing, operations, or customer loyalty, you’ve probably noticed by now that there’s a trinity, often unholy, surrounding project execution in today’s corporate environments:

  1. Project management (PM)
  2. Project management office (PMO)
  3. Project portfolio management (PPM)

PM concerns itself with the day-to-day tasks needed to complete a given project by following the procedures that have been established to make successful delivery a repeatable process. Project managers then corral their teams to achieve specific results within the timeframe and budget allocated to each project.

Meanwhile, the PMO is tasked with coordinating and assigning people to various initiatives, ensuring that process and procedure are followed properly, and, of course, budgeting. That means managing the portfolio of current and future projects. Doesn’t it? Not always, I’m afraid.

PPM: Misused and Misunderstood

PPM is the most important yet least understood of all three interrelated project disciplines. Why? Because PPM – when done effectively – is the practice that tracks project progress, assesses project risks, juggles staff resourcing, manages capacity, and supplies performance metrics to stakeholders and executives who want to know what’s going on with their projects.

In short, PPM maximizes the throughput of value-add projects at the lowest possible cost.

While all of this sounds great in theory, things go drastically wrong when the PMO focuses solely on PM and ignores PPM in part or in entirety. That’s when some steering committee steps in and demands that the PMO ‘apply governance’ and slap on a software solution to manage the project portfolio. Conventional wisdom says this is the correct approach, so it must be right, right?


Figure 1. Companies That Have Invested in PPM Solutions

n = 250 (source: Info-Tech Research Group, 2013)

Of the 16% of organizations that actually purchased a PPM solution, Info-Tech estimates that no more than 10% of the installed base is being used to its full potential.

Just to be clear, it’s not wrong to apply governance and/or software to help solve a problem. It is very, very wrong, however, to apply governance and/or software on top of processes and priorities that are fundamentally flawed to begin with. Project management is no exception.

A poorly-run PMO is one that cannot provide current or accurate status reporting on active projects, doesn’t have an intake funnel, doesn’t track the success of recently closed projects, and can’t gauge the organization’s capacity for more projects. (This is especially true in agile-based project environments, where the emphasis is on today’s sprint and not much else.)

How a PMO Goes Bad

If your PMO is focusing exclusively on last month’s estimate-vs-actuals, but can’t give you an answer when you ask “How is the project doing right now?” then you have a serious problem on your hands that no software or committee in the world can fix.

Or if you ask the PMO “Do we have enough capacity to complete this important, newly-approved project?” and all you get in return are blank stares, then you have an even bigger problem. Like, oh, I don’t know, a PMO that isn’t doing its job?

The issue of misaligned priorities is pervasive and utterly detrimental to project success. In fact, thanks to some primary research being done at Info-Tech around PPM, we routinely hear from PMO leaders who believe they are wasting 30% to 70% of the hours available for project work by:

  • Starting the wrong projects
  • Cancelling projects too late in the game
  • Failing to assign work to available resources
  • Pushing incomplete projects through too quickly

An average project waste of 50% is scary, especially when you consider that the PMO’s real job is to reduce this waste, not over-manage project minutiae to the point where waste is created.

A PMO without insights into its own project portfolio is a blind, thrashing creature. It must therefore relearn how to govern itself before it attempts to micromanage the project team leads on how they write their charters or fill out tracking templates.

Closing the PPM Gap

So what is the PMO actually supposed to be doing? For starters, it should be focusing on what’s truly important about project governance: managing the portfolio. In an ideal world, here’s what it should look like at a high level:

Figure 2. Holistic PPM Framework


And yet, many PMOs lack the ability to track available funding or project benefits, which as you can see above are the two cornerstones of PPM. So from a maturity standpoint, PPM success is always going to be more about managing the portfolio holistically than about managing the projects themselves.

But simply following a framework isn’t going to be enough all by itself: the organization will need a rightsized PPM strategy to go along with it. Above all, the PMO must ensure it covers the following bases:

  1. Improve the transparency of the project portfolio by clearly outlining tangible and intangible benefits, project categorizations, and a thorough review of the PMO’s current methodology.
  2. Assess the project portfolio’s impact on – and alignment with – different departments, business units, regions, and so on. How does one project’s approval affect each facet of every other project?
  3. Optimize the project intake/approvals process for the purposes of clarity, efficiency, consistency, visibility, and the viability of IT funding requests based on their own merits.
  4. Monitor portfolio performance and stakeholder satisfaction. It’s all about tracking at this point, and it’s only here that the PMO should even start to consider a software solution to assist it.
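To make the transparency and intake steps above concrete, here is a minimal sketch of portfolio-level project scoring in Python. All field names and weights are hypothetical illustrations, not a prescribed Info-Tech model; the point is simply that ranking candidate projects by value per unit cost forces the PMO to capture the benefit and cost data it so often lacks.

```python
# Toy portfolio scoring sketch. Fields and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    tangible_benefit: float   # e.g. expected annual savings ($K)
    intangible_score: float   # 0-10 judgment score (alignment, risk reduction)
    cost: float               # estimated cost ($K)

def portfolio_rank(projects, intangible_weight=50.0):
    """Rank projects by (tangible benefit + weighted intangible score) per unit cost."""
    def score(p):
        return (p.tangible_benefit + intangible_weight * p.intangible_score) / p.cost
    return sorted(projects, key=score, reverse=True)

candidates = [
    Project("CRM upgrade", tangible_benefit=400, intangible_score=6, cost=250),
    Project("Legacy rewrite", tangible_benefit=150, intangible_score=9, cost=500),
    Project("Reporting portal", tangible_benefit=120, intangible_score=4, cost=80),
]

for p in portfolio_rank(candidates):
    print(p.name)
```

Even a crude model like this surfaces the questions a healthy intake funnel asks of every request; the sophistication of the scoring matters far less than the discipline of scoring at all.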

Bottom Line

I am not condemning all PMOs, of course. I am sure there are many highly functional units out there. However, I have heard from far too many clients – both quantitatively and qualitatively – that their own PMOs are simply dysfunctional. So obviously there is a problem.

Perhaps you have some influence with your company’s PMO. Perhaps you work in one yourself. Whatever the case, it must be communicated to all stakeholders that an effective and productive PMO is one that makes managing the portfolio of projects its top priority.

Want to Know More?


Unless we re-think Enterprise Architecture (EA) and bring it into alignment with the agile development movement, we may be in for a collision between two cultures.

Here at Info-Tech world headquarters, we are working on a series of project blueprints on agile development (more about that below). I’ve been involved quite a bit in quality reviews of these blueprints and have started thinking about the impact of agile development on Enterprise Architecture. I see the implementation of EA as being at odds with the Agile Manifesto. Here’s the manifesto, in case you haven’t seen it in a while:

We are uncovering better ways of developing software by doing it and helping others do it.

Through this work we have come to value:

Individuals and interactions over processes and tools

Working software over comprehensive documentation

Customer collaboration over contract negotiation

Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.

Enterprise Architecture, as currently implemented in large organizations, tends to favor the items on the right over the items on the left. The stock in trade of EA consists of processes, tools, comprehensive documentation and creating plans for others to follow. As agile development grows within large organizations, something clearly has to give.

Reinvent, Oppose, or Meet in the Middle?

Can Enterprise Architecture reinvent itself as an agile practice or must it, by its very nature, oppose agile development as yin to its yang? Or should agile and EA meet somewhere in the middle?

Those of us with many years of IT experience sometimes shudder at the thought of the agile manifesto. With the gift of 20/20 hindsight, we look at the tangled mess of systems built in a typical IT shop over the last 30 years and wonder if a little planning might have saved the day. If we were guilty of anything, it was being too responsive to our business users.

Instead of building that fifteenth billing system because the user’s product line was special and needed to launch yesterday, maybe we could have insisted on re-purposing an existing system and saved the company from the burden of duplicate customer databases, nineteen different customer identifiers, and revenue reports that don’t reconcile. Do we really want to become even more responsive, adding to the spaghetti that is already strangling the typical IT department with a huge technical debt that soaks up 80% of the budget?

On the other hand, our business users are so fed up with the slow pace of IT that they are taking matters into their own hands and implementing cloud solutions left and right. If we insist on rigorous central planning, we’ll end up managing yesterday’s IT department and wondering where all the new applications went.

The Road to Re-invention

Clearly, we need to re-invent Enterprise Architecture to retain its best features and avoid technical debt while adapting to the new realities of agility and the cloud. How on earth do we do that?

Here are some of my early thoughts:

  • We need to focus on high-level principle statements and vision diagrams that can communicate the essence of our target state quickly. Our audience now consists of people who have grown up with sound bites, infographics and video games. If we can’t communicate to this audience, we are lost before we begin.
  • Our revision cycles have to speed up and adapt. The ponderous Architecture Development Method (ADM) that forms the heart of TOGAF doesn’t cut it anymore. Our architects need to become part of the agile teams, contributing to the development of working software and constantly feeding back to agile EA teams who are actively maintaining the integrity of our target state vision.
  • The various core practices of architecture (business architecture, technical architecture, data architecture etc.) need to be practiced simultaneously in agile sprints, not in sequence.
  • The notion of architecture governance, based on review gates and review boards for large waterfall projects, must be discarded and replaced with a “roll your sleeves up” approach that embeds architects with agile teams so that they can practice governance in real-time. This will avoid costly rework when teams fail to meet standards and adhere to guidelines.
  • The holy grail of a central repository of re-usable artefacts must finally be laid to rest. Artefacts rarely, if ever, get re-used, and no one ever trusts current documentation as an accurate representation of the actual state of the application. Let’s get over it and be practical.

Please note that I’m not advocating any abdication of EA responsibility. We need a clear central vision of what we’re building to avoid technical debt and ensure that systems mesh together well. However, the way we accomplish this must change with the times. We need to be able to clearly and quickly communicate our vision and become active evangelists for it, contributing to real work as we encourage our team members to do the right things.

Agile Sprint
Click For Infographic

Those Info-Tech project blueprints on agile I mentioned above include Agile Practices that Work. This blueprint is aimed at organizations adopting agile methods for the first time and is a no-nonsense practical guide to implementing scrum. A second blueprint, called Increase the Agile Footprint and currently in production, is focused on optimizing the application of agile methods and gaining momentum in the organization for the roll-out of agile to more projects. The third and final blueprint will be aimed at scaling up agile methods to take on large projects and will be based on the methodology created by Scott Ambler, called Disciplined Agile Delivery.

We are also thinking about creating a blueprint on this subject of re-inventing EA for the Agile Movement. I welcome your thoughts and experiences. Has anyone out there implemented agile enterprise architecture in some fashion?


As I move from the ivory tower of Genetics to the practical, business-related advice that Info-Tech Research Group gives clients on their IT environment, I’m amazed at how many parallels I see in the needs and the solutions across all kinds of human endeavors.

Much like my own thinking on this, many Enterprise Content Management (ECM) vendors are convinced the core issue is about information movement, not what it is stored in (i.e. a Word doc vs. an Excel workbook). In an environment where the locations where information can be stored are expanding, the easiest way to manage it is to control the access points.

The key problem is that neither vendors nor IT departments have grasped the need to understand individuals’ needs and challenges in using and moving information.

This presents problems that most organizations fail to recognize. Information management requires explicit participation by workers. The success of a group- or enterprise-level information management platform is completely dependent on its alignment with individual work needs.

There are a lot of mistakes that scientists have already made managing complex information. Businesses can save a lot of time and effort by understanding the key problems.

The CIO’s key to a long-lasting strategic information management plan is this: can you bridge the gap between individually important and corporately important information? The rest of this blog offers insight into where the key differences lie.


The technology to solve the problem is here, but acknowledgement of the actual problem is not

For me this comes back to a practical problem that I had as a graduate student. My PhD was on how genetics relates to brain development. As you can imagine, this was (and still is) a daunting problem to investigate. My research involved a variety of time-staged images, reams of Excel workbooks on cell counts, Word docs on behavior, and gene expression data from multiple sources.

As I reflect on where businesses are in their information management compared to the state of the art in genetics in the late 90s, there are plenty of parallels that corporations should understand to stop making the same mistakes that were made 20 years ago in genetics.

Business parallel No.1
Big data has always existed.

Performing research requires a wide set of data sources organized for context and value. Managing the information in different file types and databases was a big data problem before the phrase existed. In reality, I, as an individual, had no problem keeping track of all this data, looking at each piece, and doing the analysis. I had very good notes (metadata) and file naming conventions (classification) to ensure that I could easily find the file I needed.
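For readers who want a feel for what those naming conventions bought me, here is a minimal sketch in Python. The convention itself (date_experiment_subject) is a hypothetical stand-in for whatever scheme a lab or business unit actually adopts; the point is that a consistent file name is machine-parseable metadata.

```python
# Sketch: a consistent naming convention doubles as a classification scheme.
# The date_experiment_subject pattern is hypothetical, for illustration only.
import re
from pathlib import PurePath

NAME_PATTERN = re.compile(
    r"(?P<date>\d{8})_(?P<experiment>[a-z0-9]+)_(?P<subject>[a-z0-9]+)\.(?P<ext>\w+)$"
)

def classify(filename):
    """Extract metadata fields from a conventionally named file, or None."""
    m = NAME_PATTERN.match(PurePath(filename).name)
    return m.groupdict() if m else None

def find(files, **criteria):
    """Filter a file list by any combination of metadata fields."""
    for f in files:
        meta = classify(f)
        if meta and all(meta.get(k) == v for k, v in criteria.items()):
            yield f

files = ["20140115_cellcount_m12.xlsx", "20140116_behavior_m12.docx", "notes.txt"]
print(list(find(files, subject="m12", experiment="behavior")))
# prints ['20140116_behavior_m12.docx']
```

Note what happens to "notes.txt": a file that ignores the convention simply becomes invisible to any search, which is exactly the reproducibility trap discussed below.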

I was, in effect, the content management system: I defined what pieces of information had value, I understood where the files were stored, and I controlled access.

Business parallel No.2
Understanding the information and analyzing it is the problem.

The problem was synthesizing the separate content-based analyses into a single, cogent piece of information, i.e. something that could be shared with others in a common language that allows them to build their own actionable plans.

Business parallel No.3
Combining individual knowledge and group actions is poorly done.

Transferring information from an individual to a group is difficult without also transferring the individual’s intimate knowledge of the motivation for and use of that information.

Any scientist reading my dilemma from 15 years ago can probably relate, and so can anyone else who uses and presents information as part of their job. The deep understanding of information by an individual often blinds them to the subtleties in their knowledge. This makes it difficult to recognize the need for a common language and process of knowledge transfer. The reality is that technology can only solve the problem if people recognize the problem and WANT to move towards a common framework; knowledge workers always balk at the idea of following any other person’s process.

Business parallel No. 4
The blind spot to idiosyncratic behaviors by knowledge workers

Knowledge workers often believe that their creativity stems from chaos and freedom. In truth, it is a lack of structure that kills creativity by allowing the mind too much space to move within. The advent of NIH online databases for genomic, chemical, and ontological data has given scientists a framework to work within to quickly get up to speed in new areas of investigation. Unfortunately, individual labs have not adopted these group frameworks for either process or naming conventions. This has led to a lack of reproducibility and lost opportunities to generate new and unique information.

Business parallel No. 5
The balance of individuality and collaboration

This lack of a shared framework across multiple laboratories is becoming a real problem for both Pharma and academia (and everyone else). It has led to reams of lost data, along with the nuggets of insight that could provide real solutions to clinical problems. It also leads to duplication of effort and missed opportunities for revenue (grant) generation.

We can learn from the past

The answers are only interesting when viewed through the perspective of your industry: what problem should I solve to be better than my competitors?

From a healthcare perspective, if we knew more about what “failed drugs” targeted, what gene patterns they changed, and what cell types they had been tested on, we could very quickly build a database. How many failed drugs could be of use against rare diseases, a fast-growing category of need? We will never know.

This is a situation where scientists can learn from the business community about the technical tools that enable truly long-term, shareable frameworks. These technical controls are available at any price point. Conversely, the frameworks and logic that scientists use to classify and link pieces of content, i.e. taxonomy and metadata, hold lessons for any knowledge worker. It is hard work, but it doesn’t have to be time consuming. (See here for advice.)

It’s time for some open-mindedness on both sides; the needs of all kinds of organizations and workers are converging: too much data, too many types of data, not enough analysis. It is not enough to be able to judge your internal problems; the CIO must solve a business competitiveness problem to get approval.

How do we do that? We move past the idea that we need to plan discrete systems and move to holistic information architecture planning that includes file shares, email, IM, and any other system or storage point that contains information. This requires an information governance plan, built to solve your problem, not encumber you with paperwork. (See here for a practical framework.)

Stay tuned for the next blog for tips on how to move from managing storage locations to building an information architecture.

Mike Battista
Senior Consulting Analyst
Info-Tech Research Group

Successfully implement VDI, and migrate some or all of your workforce to remote access desktops, allowing productivity anytime, anywhere.

Please join me and a panel of subject matter experts on Thursday July 10, 2014 at 4 p.m. EDT for a Webinar on “Implement Virtual Desktop Infrastructure: Do you know how to successfully roll out a VDI solution?”

Register Here for “Implement Virtual Desktop Infrastructure: Do you know how to successfully roll out a VDI solution?”
(Video replay will be available at this link following the Webinar)

During this webinar we will discuss:

  • Why VDI is a key step in the larger move toward post-PC end-user computing, enabling desktop access from any device.
  • Why VDI requires a large investment in company resources and must be done right the first time if that investment is to pay off.
  • People, process, and technology best practices that avoid major pitfalls when implementing VDI.

Info-Tech Research Group webinars occur during the early weeks of our research projects. Attendees will weigh in on several key polls and will be able to pose questions to the group. We want to work closely with our members and potential members as we build out our research to ensure we are thoroughly meeting your needs.
