The software supply chain - keeping it linked

Micro Focus Pty Ltd

By Chris Livesey, Chief Marketing Officer, Micro Focus
Thursday, 18 September, 2014



Complexity is threatening to derail the software supply chain. The frantic pace of change, driven by mobile, cloud and the rise of the consumer, is forcing organisations into more complicated ways of working, with traditional IT departments outsourcing skills in order to meet delivery times their customers need.

According to an independent global research study undertaken by Vanson Bourne and commissioned by Micro Focus, an average of one-third of CIOs across Australia and New Zealand are choosing to specifically outsource application development and testing. Any sense of ownership businesses might have once felt is being eroded by the influx of personal devices and the growing number of projects originating outside the IT department.

For development managers there’s a double hit - not only are they responsible for delivering the consumer applications their customers need, but they must do so increasingly with loosely coupled teams that are, themselves, mobile and set on using their own technology and methods.

While none of this is particularly new, the scale and pace of these ‘disruptive technologies and innovations’ are unprecedented. To maintain an effective supply chain and meet customer demand in the face of such diversity and complexity, a new approach is needed.

Successful software teams are going back to basics, reviewing hard lessons and focusing their efforts on creating a delivery ecosystem built on three key elements: precision, validation and control.

Software delivery in the 21st century

Recent studies suggest that software development projects are delivered right the first time on fewer than 40% of occasions. Gartner puts the proportion of projects completed on time in 2013 at 55.8%, and on budget at 67.5% - figures that still leave considerable room for improvement, especially given that they are averages and therefore hide some spectacular failures.

Year on year, even where methods have been modernised, we still see projects failing, through:

  • Lack of on-going precision: Inconsistencies between what was requested and what was delivered, creating delays and tension between the business stakeholders and IT department.
  • Limited solution validation: Testing for the absence of defects rather than comparing tested output against the business specification, leading to considerable rework and yet more dissatisfaction.
  • Insufficient control: Weak change and configuration management leads to missed deadlines, broken customer commitments and exposure to reputational risk.

Whether development teams are operating within the inflexibility of a large-scale waterfall model or inside a potentially chaotic agile set-up, organisations will continue to languish at the same low levels of efficiency that have prevailed for the last decade unless they embrace the underlying technology needed to support each team’s preferred way of working.

Precision

The single biggest influence on project inefficiency has always been requirements. The tremendous rate of change in technology means that in many ways, it is now even harder for development teams to capture and deliver exactly what the customers want.

Providing an effective technology platform for ongoing engagement with all business and customer stakeholders is a fundamental building block.

However, successful teams navigating today’s software supply chain must capture requirements across every channel and format - mobile or social, visual or text, instant message, email, video or voicemail. Only then can all relevant requirements be gathered and continually refined, and technical issues communicated and resolved in a timely manner.

Test cases can then be kept in sync throughout the life cycle of work to ensure that what is actually being tested is what the customer actually expects to see, avoiding ‘scope creep’, frustration and endless delays.
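The idea of keeping test cases in sync with evolving requirements can be sketched as a simple staleness check. Everything here is illustrative - the requirement IDs, the version fields and the in-memory data structures are hypothetical stand-ins for whatever a real requirements tool would store:

```python
# Illustrative sketch: flag test cases written against an older version of
# their linked requirement. All IDs, fields and data are hypothetical.

requirements = {
    "REQ-101": {"version": 3, "text": "Discount capped at 100%"},
    "REQ-102": {"version": 1, "text": "Invalid input is rejected"},
}

test_cases = [
    {"id": "TC-01", "requirement": "REQ-101", "req_version_tested": 2},
    {"id": "TC-02", "requirement": "REQ-102", "req_version_tested": 1},
]

def stale_tests(reqs, tests):
    """Return IDs of tests written against an outdated requirement version."""
    return [t["id"] for t in tests
            if t["req_version_tested"] < reqs[t["requirement"]]["version"]]

print(stale_tests(requirements, test_cases))  # TC-01 is stale: REQ-101 moved on
```

A check like this, run continuously, is what turns the requirement-to-test link from documentation into an early warning against scope creep.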

Validation

Complexity in the software supply chain presents many challenges, and nowhere is this more significant than in the software testing team. Balancing the need for rapid time to market, while still ensuring that quality software is being released, is a delicate task.

The evolution of software delivery means that quality assurance can no longer function as a standalone gatekeeper in classic waterfall style, hunting bugs and sending defective software back to Development with a ‘must try harder’ stamp on it.

Validation is multilayered - an understanding that is too often lost once software testing begins. Effective validation ensures that equivalent test cases are not only created but remain linked to their requirements throughout the supply chain process, changing in harmony as each new expression of need is captured.

While agile development is bringing the testing function closer to the development teams - suggesting that these close and continuous linkages are easier to achieve - the rise of outsourcing, at various points along the supply chain, means that effective collaboration can actually be harder to achieve, unless appropriate technology is deployed to facilitate agreement of what success looks like in the eyes of the customer, who is the originator of the business need.

Agile development, with its emphasis on working software and regular testing, needs test automation. Not only does automated testing help catch defects earlier in the cycle, it essentially builds quality into the process.
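What a requirements-linked automated test looks like can be sketched in a few lines. The function under test, the behaviour it implements and the requirement IDs in the comments are all hypothetical examples, not anything from the study or from Micro Focus tooling:

```python
# Minimal sketch of automated tests tied back to (hypothetical) requirements.

def apply_discount(price: float, percent: float) -> float:
    """Apply a percentage discount; rejects percentages outside 0-100."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# REQ-101 (hypothetical): a 10% discount on $50.00 must yield $45.00
def test_req_101_standard_discount():
    assert apply_discount(50.00, 10) == 45.00

# REQ-102 (hypothetical): invalid discount percentages must be rejected
def test_req_102_rejects_invalid_percent():
    try:
        apply_discount(50.00, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass

if __name__ == "__main__":
    test_req_101_standard_discount()
    test_req_102_rejects_invalid_percent()
```

Because each test names the requirement it validates, a failure points straight back to a business expectation rather than just a line of code - which is the sense in which automation builds quality into the process.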

Through this, even the most highly distributed of supply chains is able to ensure that validation is not only requirements-based at all appropriate levels, but is measurable and highly effective.

Control

It is widely held that IT will only achieve significant improvements in efficiency and customer satisfaction when its projects are based around agile development. This is simply recognition of the fact that change is occurring too fast for lengthy requirement gathering and development cycles to cope.

While agile development grew out of small teams refining and releasing working software in short bursts, distributing those teams across the enterprise creates challenges, and introducing external teams creates even more complexity.

The answer is to implement underlying technology to capture change across any of the relevant artefacts, be they requirements, discussions, user stories, code, release packages or any number of custom components specific to the project.
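One way to picture such technology is a single change log that records events against any artefact type. This is only a sketch of the concept - the artefact names, fields and API are invented for illustration, not a description of any particular product:

```python
# Illustrative sketch: one audit trail spanning all artefact types.
# Artefact names, fields and the API shape are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeEvent:
    artefact_type: str   # "requirement", "user_story", "code", "release", ...
    artefact_id: str
    author: str
    summary: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class ChangeLog:
    def __init__(self):
        self._events: list[ChangeEvent] = []

    def record(self, event: ChangeEvent) -> None:
        self._events.append(event)

    def history(self, artefact_id: str) -> list[ChangeEvent]:
        """All recorded changes for one artefact, oldest first."""
        return [e for e in self._events if e.artefact_id == artefact_id]

log = ChangeLog()
log.record(ChangeEvent("requirement", "REQ-101", "analyst",
                       "tightened acceptance criteria"))
log.record(ChangeEvent("code", "billing.py", "dev", "implemented discount cap"))
log.record(ChangeEvent("requirement", "REQ-101", "analyst", "added edge case"))
print(len(log.history("REQ-101")))  # two changes recorded against REQ-101
```

The point of a uniform record like this is that management gains visibility across every artefact without dictating which tool each practitioner uses to produce it.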

Achieving this without the enforcement of process or tooling at the practitioner level is a key part of effective control, empowering users without disrupting management visibility.

Loosely coupled control of the software supply chain is the only way forward. Regardless of whether organisations are working with third parties or have a distributed team in-house, the need for remote workers to interact with each other and with a centralised management function is fundamental.

In turn, the working relationship is eased, encouraging collaboration that is visible and continuous, not antagonistic, sporadic and veiled in the ‘black box’ secrecy that so easily plagues outsourced engagements.

Conclusion

Complexity has become an inevitable part of software management and delivery. With software release timescales decreasing all the time, development teams are outsourcing all aspects of the software development life cycle in an effort to keep up, complicating the supply chain and yet still failing to achieve acceptable levels of effectiveness.

A new perspective is required. Tooling for better precision, validation and control is needed to accommodate supply chain thinking and to facilitate the shift to collaborative partnerships instead of today’s adversarial engagements. This approach is the only way to raise the game of successful software delivery in the future.
