Escaping the Document Apocalypse

Thursday, September 1st, 2022

Quoc Pham, Product Lead at VibeIQ, shares his thoughts on a new generation of documents for true end-to-end digital transformation. VibeIQ recently launched a suite of collaboration-centric merchandising and product visualization apps.


PLM was supposed to be the “single source of truth” that would eliminate the need for spreadsheets. Look behind the curtain, however, and a plethora of Excel tools, PowerPoint presentations and other offline documents are still widely used within organizations small and large. In fact, one can argue that spreadsheets and presentations remain the primary source of product and assortment data, powering a variety of arcane processes across the enterprise. So, what happened? Why can’t we get rid of the dreaded spreadsheets once and for all? The answer is simple: people love documents.

Our Love/Hate Relationship with Documents

Despite the many attempts to eradicate them, people have always relied on documents because they provide unmatched freedom in how work gets managed. Within a spreadsheet or PowerPoint presentation, users have the power to mold and format unstructured information into pretty much anything. This makes documents the default solution for countless use cases: from planning and presentation to reporting and everything in between.

A document also provides a safe personal space to experiment without any repercussions to the outside world. Users remain in complete control of the information until they decide they are ready to package and broadcast their work, often using email as the primary means of communication. This archaic form of collaboration has always been familiar, manageable and universal.

PLM, on the other hand – like many enterprise applications – is structured and rigid, with little control over how information is propagated. With complicated, form-based user interfaces and often rudimentary collaboration capabilities, today’s PLM will never match the ease of use and familiarity of documents. This makes PLM more of a data repository that captures the result of a process rather than a tool that supports the process itself.

While users love them, the drawbacks of document-based processes are plentiful: version control nightmares, lack of visibility and manual rework are among the most commonly cited pitfalls. In layman’s terms, documents create friction, and the more documents, the more friction.

Who’s Left Behind?

The inefficiencies of working with documents affect many functions, but if we were to single out a role where the pain points are especially acute, it would be the merchandiser. Their day-to-day tasks rely on a constant mix of planning, analysis and product visualization – what they often refer to as a blend of “art and science”. Sadly, those capabilities are chronically lacking from most current PLM solutions. As a result, merchandisers typically rely on spreadsheets to plan and iterate on their assortments while using PowerPoint to visualize, present and collect feedback on them.

Consider what happens during a typical milestone meeting: a merchandiser exports a linesheet from PLM and creates a PowerPoint presentation by copying and pasting product images and metadata from the Excel export. The resulting document is distributed to multiple stakeholders so they can capture their notes during the actual meeting. A subsequent meeting is then scheduled to collectively review and discuss each other’s notes. A poor soul must then consolidate all the feedback into a separate spreadsheet before typing that same information back into PLM. This convoluted workflow almost comically illustrates the level of friction that still surrounds PLM and why merchandisers have never adopted it as their tool of choice.

The increasing complexity of how brands go to market means merchandisers are under pressure to create and maintain more documents than ever before. They must constantly keep up with new specialized channels, each with its own set of stakeholders and its own assortment planning and presentation needs.

For retailers that are adopting 3D design, the situation gets even more dire. Documents have always been adequate for simple data types such as text and images, but they struggle with 3D models due to their large file sizes and the heterogeneity of standards and formats. Digital product creation has undoubtedly introduced another layer of friction that no software vendor seems to have been able to solve.

Over time, merchandisers, like many other users, grew comfortable with the status quo, as documents helped them circumvent the rigidity and structure of PLM. This all changed in 2020.

The Covid Effect

Even though documents had been holding us back for years, it all came to a breaking point during the pandemic. Suddenly, teams were no longer able to travel and meet in person for milestone reviews, brand conferences and sell-in meetings. Our dependency on documents increased as teams were forced to work 100% digitally. Remote collaboration capabilities became the number one priority, leading many retailers to experiment with new tools and radical new ways of working as a survival measure.

Logically, this led many organizations to look at cloud solutions as an alternative or complement to their existing workflows. The idea of collaborating through cloud-based documents was nothing new, as many organizations had already been transitioning to online versions of their productivity suites with Office 365, SharePoint and Google Docs. Covid only accelerated their use and expansion within the enterprise.

More importantly, the advent of remote work culture demanded a new breed of documents with collaboration at their core. Digital whiteboards emerged as a standard for virtual collaboration, with tools like Miro and Bluescape redefining how teams engage in real time. Database-driven applications like Monday and Airtable elevated documents into fully fledged project management and collaboration tools. Finally, video conferencing and communication platforms such as Zoom and Slack became household names overnight.

This new era of collaboration brought many benefits to the table and helped organizations accelerate their digital transformation. However, these new types of documents quickly hit a major limitation: they remain largely disconnected from, and unaware of, product and enterprise data, making them inadequate for mission-critical workflows.

This creates a situation where most product and assortment data is now scattered across a growing number of documents, both offline and in the cloud, as well as enterprise systems that still need to be updated manually. This proliferation of documents and the lack of interoperability between the various solutions are creating a level of digital friction never seen before. This is the document apocalypse.

The Best of Both Worlds

With the recent adoption of modern collaboration technology, it is clear that users’ expectations have changed. More than ever, PLM solutions feel outdated and ripe for innovation. This opens a glaring opportunity for convergence between enterprise applications and modern collaborative documents. This new hybrid paradigm would bring together the structure and robustness of PLM with the ease of use and familiarity of spreadsheets, presentations and digital whiteboards. Used together, these new types of connected documents could address countless business use cases.

To succeed, these emerging tools must meet the following imperatives:

  • Modern user experience: They should feel like the documents users are familiar with, without a steep learning curve.
  • Cloud based: They should be fully cloud-based and easily shareable with anyone inside or outside the organization.
  • Real time multi-user collaboration: They should allow multiple users to make edits, comments and change suggestions simultaneously with the option to easily report on changes.
  • Freedom to ideate: Like their offline ancestors, these documents must give users the control and ownership needed to experiment and “play” with information without having changes immediately impact other users of the system.
  • Product and data associativity: They should be able to understand and interact seamlessly with products, assortments and other business entities through open APIs. Product data should always be dynamic and up to date against a central data store (see the sketch after this list).
  • Support for multiple forms of product visualization: These tools must support a variety of 2D and 3D assets coming from multiple sources. 3D should always be an option for visualization wherever products are involved.
  • Analytics capture: Unlike today’s disconnected documents, it should be possible to capture rich analytics and feedback about products and then report on that data in a centralized way.
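
To make the “product and data associativity” imperative concrete, here is a minimal TypeScript sketch of what a connected document might do: store only a product reference and pull live data from a central store through an open API. The endpoint, the Product shape and the field names are illustrative assumptions, not the API of any particular platform.

```typescript
// Minimal sketch: a connected line sheet cell references a product by ID and
// always renders the latest data from a central store, instead of embedding a
// static copy the way a spreadsheet or slide would. (All names are invented.)

interface Product {
  id: string;
  name: string;
  colorway: string;
  wholesalePrice: number;
}

// Hypothetical central data store exposed through an open REST API.
const API_BASE = "https://api.example-gtm-platform.com";

// Fetch the current version of a product from the central store.
async function fetchProduct(productId: string): Promise<Product> {
  const response = await fetch(`${API_BASE}/products/${productId}`);
  if (!response.ok) {
    throw new Error(`Failed to load product ${productId}: ${response.status}`);
  }
  return (await response.json()) as Product;
}

// The document stores only the product ID; what it renders stays dynamic and
// up to date against the central store rather than being a frozen copy.
async function renderLineSheetCell(productId: string): Promise<string> {
  const product = await fetchProduct(productId);
  return `${product.name} (${product.colorway}) at $${product.wholesalePrice}`;
}
```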

Unless most, if not all, of these imperatives are met, users will most likely revert to the form of document they trust, regardless of the level of friction involved. If adopted, on the other hand, this new generation of documents, along with the supporting platform architecture, will supersede PLM to become the foundation for true end-to-end digital transformation.

With a new, pandemic-hardened generation entering the workforce, expectations will only continue to rise. Brands that can leverage these emerging types of smart documents to support their core processes will unlock a new level of business value and productivity across their workforce. More importantly, they will significantly improve employee retention and overall happiness, which have been neglected for far too long.


The End of Monolithic PLM?

Friday, July 1st, 2022

In recent years, we’ve witnessed major technology innovations in the area of digital commerce, with a shift from traditional monolithic, server-based architectures towards cloud-native solutions and, more recently, “headless” API-first platforms. The main driver behind these innovations is the need for increased flexibility and agility as brands deliver innovative customer experiences in a fast-changing digital retail environment. While the adoption of these consumer-facing technologies has accelerated in recent years, the “pre-commerce” technology stack has seen little innovation. In that space, monolithic PLM still reigns as the dominant architecture within the RFA industry.

So we must ask ourselves: will monolithic PLM solutions go through a similar architectural evolution and, more fundamentally, are they still fit for purpose in today’s quickly evolving go-to-market process landscape? To answer these questions and understand where we are today, we must first take a quick look back at the history of monolithic PLM.

The legacy of monolithic PLM

Monolithic PLM solutions emerged in the early 2000s amidst the advent and democratization of web-oriented technologies and programming languages. These made it possible to develop sophisticated web server-based applications with domain-specific data models and rich user interfaces that could be accessed from anywhere with a browser. This was revolutionary in comparison to AS/400-era homegrown applications with often rudimentary user interfaces, which had to run on terminal clients within a company’s internal network infrastructure.

This led to the genesis of a first generation of retail-specific PLM solutions, ushering the RFA industry into its first major wave of digital transformation. Around that time, many retailers had outgrown their internally developed solutions or were attempting to systemize their inefficient document-based processes. Monolithic PLMs offered a compelling value proposition: the ability to centralize all product development data and processes under one all-encompassing, robust enterprise application. The “single source of truth” motto became the key selling point and business driver behind most PLM implementation initiatives.

Over the next two decades, monolithic PLM solutions matured, significantly increasing in scope and complexity in response to an ever-growing list of requirements from a diverse set of retailers chasing the elusive “single source of truth”. What started essentially as a tool to manage CADs and tech packs turned into an all-in-one solution supporting a massive process landscape, ranging from component libraries to quality assurance and customs compliance, with everything in between. This requirement proliferation spurred a rat race among PLM vendors, each jockeying to capture market share by bloating their solutions with new features, many of them better suited to flashy demos than real-life use. In a way, monolithic solutions became jacks of all trades and masters of none.

The increasing size and complexity of monolithic PLMs made their implementation and maintenance difficult and costly, resulting in long innovation cycles. In a monolithic system, the components of the application are tightly coupled, creating a high level of interdependency between the front end, back end, database layer and other third-party components. This means that any time a change is made to the application or any of its internal components, the entire system needs to be re-tested and deployed all at once. The resulting infrequent deployments make true agile development difficult. It also means that individual parts of a monolithic application cannot be replaced without major refactoring, preventing software vendors from leveraging the latest and greatest in web technologies, notably on the front-end side. In short, monolithic PLMs are slow, expensive and increasingly outdated.

These problems were exacerbated by another flaw: monolithic PLMs tend to be extremely complex on the inside but expose very little in the way of APIs to the outside world. Integrations are often limited and typically require custom development and middleware. This lack of native connectivity also prevents retailers from fully leveraging emerging capabilities such as machine learning and artificial intelligence, which rely on a constant stream of information. By nature, monolithic PLMs are silos of product development data, limiting their value as part of a modern connected enterprise.

Despite the challenges, monolithic PLMs served their intended purpose over the years and, in many ways, provided a much needed sense of structure and control across mission critical product development processes. They were the perfect solution to the original problem: eliminate process redundancy and technology fragmentation within increasingly complex product development organizations.

An inconvenient “source of truth”

Fast forward 20 years, and the technology foundation that underpins monolithic PLMs has seen very little evolution. The RFA industry, however, has. In that timespan, we have witnessed dramatic changes in how brands go to market: from the rise of digital channels and fast fashion to a shift towards private label design and vendor-driven development. Retailers have had to constantly realign their processes to adapt to these new, complex business models. In this climate of change, agility and resilience became the most important qualities of any product development organization.

Alas, the “single source of truth” mantra that was once heralded as a key PLM benefit slowly became a limitation. The structure and control that came with it arguably hinder an organization’s ability to be agile. Aligning an entire business on a rigid, workflow-driven process is no longer a priority. Instead, retailers need the ability to quickly pivot their operations and ways of working to support channel-specific processes. Monolithic PLMs were never well suited to omnichannel in the first place; after all, they originated in the era of brick-and-mortar retail and rigid seasonal deliveries.

PLM’s rigidity became especially problematic in creative design and merchandising, where processes rely heavily on experimentation and visual collaboration. Ask any designer their opinion of PLM and you’ll almost certainly hear that it “stifles” their creativity. To merchandisers, PLM lacks fluidity, shareability and assortment visualization capabilities. As a result, they resort to spreadsheets, PowerPoint and offline communication to do their job.

This highlights the uncomfortable truth about monolithic PLMs: to many users, they are the rigid “system of record” where data is captured, but never the place where teams collaborate, ideate and make decisions.

Towards a new paradigm: the GTM Platform

The future of monolithic PLM is uncertain. One thing, however, is clear: technologies that enable retailers to unlock a greater level of agility and collaboration within their go-to-market processes will prevail. This points logically to cloud-native platform architectures, which have already revolutionized the digital commerce space. These platforms should not only cover the scope of PLM but also embrace a broader set of stakeholders and processes spanning the entire go-to-market (GTM) landscape. Let’s call them GTM platforms.

The main characteristic of GTM platforms is that they will be designed as API-first microservices, allowing for flexibility and interoperability between back-end logic and end-user applications. These microservices will enable a variety of specialized apps to interact with business entities – such as products, assortments and component libraries – independently of each other. This architecture will make it possible to develop a new breed of lightweight front-end applications with world-class user experience and collaboration capabilities.
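
As a rough illustration of this pattern, the TypeScript sketch below shows a lightweight front-end app composing data from separate assortment and product services. The service URLs, entity shapes and endpoints are all invented for the example; they are not the API of any existing GTM platform.

```typescript
// Illustrative only: both service URLs, the Assortment shape and the Product
// shape are assumptions made for this sketch, not a real platform's API.

interface Product {
  id: string;
  name: string;
}

interface Assortment {
  id: string;
  name: string;
  productIds: string[];
}

// Each hypothetical microservice owns one kind of business entity.
const ASSORTMENT_SERVICE = "https://assortments.example-gtm.com";
const PRODUCT_SERVICE = "https://products.example-gtm.com";

async function getAssortment(id: string): Promise<Assortment> {
  const res = await fetch(`${ASSORTMENT_SERVICE}/assortments/${id}`);
  if (!res.ok) throw new Error(`Assortment service error: ${res.status}`);
  return (await res.json()) as Assortment;
}

async function getProduct(id: string): Promise<Product> {
  const res = await fetch(`${PRODUCT_SERVICE}/products/${id}`);
  if (!res.ok) throw new Error(`Product service error: ${res.status}`);
  return (await res.json()) as Product;
}

// A lightweight front-end app (a whiteboard, a planning grid, a line review
// tool) composes data from whichever services it needs, and can be replaced
// without touching the back end.
async function loadAssortmentBoard(assortmentId: string): Promise<void> {
  const assortment = await getAssortment(assortmentId);
  const products = await Promise.all(assortment.productIds.map(getProduct));
  console.log(`Loaded "${assortment.name}" with ${products.length} products`);
}
```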

Organizations will no longer be tied to a single technology stack, reducing vendor lock-in. Rather, they will be able to mix, match and orchestrate various capabilities to support diverse omnichannel and category-specific processes. In theory, a brand could use a combination of apps from multiple software vendors alongside internally developed ones, as long as they can all communicate through APIs within a GTM platform.

GTM platforms will enable a much greater level of flexibility in how brands design their future technology stack(s). They will be the technology enabler that spurs a new era of innovation in the “pre-commerce” space. More importantly, they will empower organizations to achieve a truly agile go-to-market.

Conclusion

One of the silver linings of the pandemic is that it exposed the gaps and limitations in the tools and processes that have been the status quo for decades. It also shed light on the unsustainable way our industry has been developing products and bringing them to market. To many retailers, the experience of working remotely over the past 18 months has highlighted an undeniable fact: digital ways of working are now the norm. Organizations that achieve agility by harnessing new innovations in platform architecture and collaboration tools will undoubtedly gain a competitive advantage. The others will be left with paper cut-out dolls, foam core boards and physical showrooms, which already seem like artifacts of a bygone era.
