Sunday, December 01, 2013

Product Manager definition: A translator?

About 18 months ago I took a pretty big side step in my professional career: I approached Dominique Leblond and told him I wanted to join his team as a Product Manager. Six months later I left Professional Services - where pretty much all of my career was built - and moved away from a comfortable position as Principal Consultant for SDL US to that hotbed of discussion, politics and Priority Management: Product Management.

One of my dearest friends told me: "I hope you make me change my opinion of PMs. Every single one I know sucks".

To me it was a simple decision: move from a position where I implement the product - often working around design limitations and customers' lack of vision - to a position where I can change the product to more closely match the needs of (current day) WCM customers. It is no secret that WCM has changed immensely in the past 10 years, and to continue to stay ahead of the curve we need to (like everyone else) start worrying about what happens after you click the publish button. (Disclaimer: I am in no way responsible for SDL's move into this space; these are processes and projects that can take years to complete and were already in progress before I joined Product Management - probably the number 1 reason I joined PM was that I agreed with the vision.)

What wasn't so clear was what exactly a Product Manager does. I obviously saw and was inspired by the performances of my fellow PMs (Davina, Alexandra and of course Dominique) whenever they had a chance to talk to us - on new product features, on new launches, on roadmap planning - but it wasn't really clear to me what goes on behind the scenes that makes the clock tick. A roadmap or a product launch is not something that happens in a vacuum, born out of pure boredom.

So, in the past 14 months I've been learning the ropes of what it takes to be a good Product Manager. I took to the web for inspiration, and I've learned that if you're good, you're the number 1 position your company can live without... at least for the first year or so. Do go read Kenneth Norton's take on it, a very good read.

And then, yesterday, a thought hit me on the head about what probably defines my role best: I am a translator. I spend my days translating vision into high-level, flashy, sales-ready brochures and presentations; translating vision into low-level, very un-flashy, development-ready epics and themes; translating high-level, blurry customer requirements into development themes; and translating low-level, incredibly detailed, narrow-focused requirements from the implementer community into higher-level, theme-linked approaches.

Obviously, interpreting priorities is the most challenging part of my job. Understanding that there are 20 things our teams could be doing, but only 5 will be done by the next release is easy. Deciding which 5 is very hard. And I have to do that by translating the needs of our customers (internal or external), the longer term goals of the company, the short term goals of the company (including sales - sales goals are always short term, no matter which company you work for) and doing what's right.

So... let's take a look at an example of how this translation process goes, shall we?

  • Roadmap states "Mobile Experience Management" as a theme.
  • Translate up: Provide clarity to Product Marketing and Sales teams as to what components are modified (and how) to enable "MEM" (because we need a new acronym, CEM, CXM, WCM, PEM and such others are not enough :-)).
    • This will take the form of high-level briefings and presentations on how Mobile Experience Management will be part of the day-to-day work of both developers and content editors, by using modules such as the SDL Tridion Context Engine and Experience Manager's Device Preview (which is awesome, by the way)
  • Translate down: Provide clarity to developers on how to group devices together, use Ambient Data to track device information, understand if we are currently in "Device Preview" mode rather than a real device (and take corrective measures, for instance, on how we determine which browser you're using)
    • This will take the form of low-level use cases and functional requirements that can be further translated to "real" development actions.
  • Translate left: Provide information to implementers on how to use this new feature called "MEM" to their benefit, usually by making sure it is all correctly documented.
  • Translate right: Inform existing customers and prospects on how MEM will make their life so much easier, they'll forget they ever had a challenge with mobile.
And this is how I spend most of my time: translating. Looking at the same topic from 4 different views, describing it in 4 different ways, using different language, using different techniques and tools, and trying to bridge expectations across all 4 "channels".

A similar process is done with translations going the other way around, where a requested functionality may end up becoming a theme by itself and land on the roadmap, and eventually the product (recent examples include our completely revamped workflow engine and bundles - both introduced in SDL Tridion 2013).

If it sounds boring to you, well, maybe you're just not the personality type that would like Product Management. I do spend a lot of time discussing themes and future developments - not only of the product, but of the web as a whole, with a focus on how Content Management must evolve. But at least half of my time is spent translating. And it's great: I often end up knowing a feature better than if I had designed it myself :-)

Friday, November 22, 2013

Playing with the future - Part 5 - Any tool can create content


Today, we all are Information Architects. The average number of documents, presentations, emails, blog posts, and the myriad other information sources we have to cope with daily continues to grow exponentially, with no end in sight.

So we all come up with nice little tricks to organize our content. Some go for the "all in one folder" approach (works with good search), others go for the super-structured approach to content management (folders upon folders of content hierarchies), and others (if your company's smart enough about knowledge management) go for the Enterprise Search approach: "Dump it in any of our document repositories, and go to this URL to find it again".

Coveo nod: I really like their tagline of "Stop searching and start finding".



So, as part of our daily job as information architects (for our own information, not our organization's), we work with a lot of tools, and very often the tool you use is determined not by your preferences but by the intended audience of the content you're creating:

  • Microsoft Word for the audit report
  • Microsoft PowerPoint for the roadmap or visionary statement
  • Blogger/wordpress for your personal blogpost
  • Email for the quick communication
  • Twitter for the even quicker communication
  • Tridion / CQ / SiteCore / Sharepoint for your company's official blog
  • Visual Studio or Eclipse for the really cool stuff
  • OneNote (or EverNote or Google Keep) to take notes during meetings
  • Prezi for the "I'm cool" effect (nope, doesn't work that way anymore, you're 3 years late)
  • Confluence for requirement gathering and roadmap grooming
  • Jira for backlog management
  • Facebook for the family/friend hugs
  • [list goes on]
Many years ago I remember thinking that, perhaps, the browser would be the tool of the future. Nope, that didn't really work either - yes, you do use your browser to do a lot of your work today, but you're not really using a browser - you're using the application behind the browser.

And another thing that is happening is that we're losing the W in WCM. Content that is not web accessible is not really content anymore, is it?

Hence my prediction... tools that can handle content transformation easily and can abstract the delivery mechanism are the tools we're going to use for everything. CM will eventually become a standard set of APIs (yeah, yeah, CMIS is an effort in that direction... but not really there yet, and too enterprise-y) and the tool you use to create content won't matter anymore. Because there will be enough intelligence behind the tool to "understand" what you're talking about (see part 1 and part 4 of this series) there will also be enough intelligence to understand how to transform that content to your required delivery format. And the tool(s) of the future will be born to address this requirement - hide all information architecture complexity from me, let me create content as content, and then help me deliver the content to my audience. And don't make me think.
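To make that prediction a little more concrete, here is a minimal sketch - every name in it is hypothetical, and none of this is a real CMIS or SDL API - of the core idea: content created once as plain structure, with the delivery format hidden behind a single transformation step.

```python
# Illustrative sketch only: content is created once as plain structure;
# per-channel renderers decide the delivery format, so the authoring
# tool never needs to know (or show) how the content will be delivered.

def to_html(item):
    # "web" delivery: wrap the structured fields in markup
    return "<article><h1>{title}</h1><p>{body}</p></article>".format(**item)

def to_plain_text(item):
    # "email" delivery: same content, different transformation
    return "{title}\n\n{body}".format(**item)

RENDERERS = {"web": to_html, "email": to_plain_text}

def deliver(item, channel):
    """Create content as content; let the tooling pick the output format."""
    return RENDERERS[channel](item)

item = {"title": "Hello", "body": "Content is data."}
print(deliver(item, "web"))
print(deliver(item, "email"))
```

The point of the sketch is only the shape: the author touches `item`, never the renderers - that separation is what would let "any tool" create content for any delivery platform.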

Saturday, November 16, 2013

Playing with the future - Part 4 - Data is the future of content

A few months ago I had a question from a prospect that had me stumped.

Your product is great with content, but how does it deal with data?
It took me a while to understand - more context was needed - and it still haunts me somewhat today. To me, content is data, so what in the name of <insert random deity> did they really mean?

Well, what they meant is not exactly what this post is about, but it is somewhat linked. In their specific scenario Data meant semi-structured data feeds they get from their other applications that may or may not be displayed on the website.

How to deal with it is linked to the topic of this post.

You've seen the semantic web at play. For us WCM geeks, some of the first examples of the semantic web were those "helpful links" under a Google search result (and now they show with even more detail, like a link to a related blog), and lately, with efforts like GoodRelations and schema.org, the semantic web keeps creeping up on us, with great results for everyone (and search engines!).

With me so far? Semantic Web is good, content is good, data is also content, but might come from a different source.

Now, why do I say that data is the future of content?

If you've developed websites before (not just snazzy HTML, I mean really designed websites, web experiences, content creation flows, contextual experience definitions, etc.) you've probably been frustrated, like I have, by the lack of metadata on content. Editors seem to just want an HTML WYSIWYG editor to create content, but then expect you to do miracles with how the content displays: magically determine which pieces of data are used for your tab names, which images to use for the home page, etc. I had a particularly harrowing experience with a given editor who insisted all content should be classified under "Personal Finance", yet expected the site to be able to tell the difference between auto loans and mortgages (deep sigh).

Summary #2:
  1. Content Editors want an easier to use, easier to create content in, simpler UX/UI paradigm that allows them to create semantically meaningful content without having to deal with complex operations or data structures.
  2. Semantic content needs proper annotations, schema compliance, and contextual information (used to determine, for instance, when it is appropriate to show this content and when it's not)
  3. Both points above are at odds. To create semantically meaningful content editors need to spend more time curating their content.
And this means that the solution is to enrich content with semantically meaningful metadata automatically (with the possibility to be modified/enriched by the editors).

In other words - editors will get what they want: simpler ways to create content; and developers will get what they want: more meaning attached to their content in the form of "data" - metadata, ambient data, structured streams, whatever you want to call it. And that will allow us to start creating smarter UIs that can help you determine layout, presentation and context with less effort. All you need is more data, and we will get more data from intelligent systems that can do most of the digging for us.
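As a rough sketch of that enrichment flow - with a deliberately naive stand-in for the intelligent tagger, and field names invented for the example - layering editor choices on top of machine-derived metadata could look like this:

```python
# Hypothetical sketch: metadata is derived automatically first, and the
# editor's explicit choices are layered on top (editors keep the last word).

def naive_tagger(text):
    # stand-in for a real entity-discovery / classification service
    keywords = [w for w in ("mortgage", "auto loan") if w in text.lower()]
    return {"topics": keywords or ["uncategorized"]}

def enrich(content, auto_tagger, editor_overrides=None):
    meta = auto_tagger(content)          # the machine does the digging
    meta.update(editor_overrides or {})  # editors may still correct/extend it
    return {"body": content, "meta": meta}

doc = enrich("Compare mortgage rates today", naive_tagger,
             editor_overrides={"section": "Personal Finance"})
print(doc["meta"])
```

The design choice worth noting is the `update` order: automation proposes, but the editor's input always wins, which is exactly the "modified/enriched by the editors" escape hatch mentioned above.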

Monday, November 11, 2013

Playing with the future - Part 3 - Context Engines

As a starting point of our 3rd post in this series of "non-binding futuristic plays", I'll start by telling you a secret that everyone except Marketing seems to have realized. Actionable analytics already exist, and they've been around for quite a while.

Yes, it's true. I keep hearing some babble babble from Marketing people on how they need actionable analytics, and we (IT guys with a clue) keep on asking them "how do you want to use them?".

You see, we have the data. We have had the data for many years now. The challenge is not there. The real question you (Marketing guys with a clue) should be asking is "how do I use the data we have without having to call you every time I need to change something?" So, basically, what you need is not actionable analytics. What you need is a way for you to act on the data, and a way to know which data you have, that doesn't involve calling me or some other IT fellow unfortunate enough to be on your quick-dial list.

OK, guess that's enough to set the stage for what I want to talk about today: Context Engines. Back in August we released (rather silently, I admit, and for good reason) version 1.0 of the SDL Context Engine, and we're now finalizing version 1.1 (to be part of SDL Tridion 2013 SP1 and available for 2011 SP1 as an add-on, if that's what you were going to ask) and I am really impressed with what we were able to cook so quickly.



What does a context engine do?

I believe that modern sites should be able to answer, within milliseconds, a very simple question: why did I come here? Understanding the reasons that drive someone to open up a given URL gives us the insight required to serve that visitor's request quickly and without wasting their time (i.e., providing a good web experience). And there's no other way to understand why you're here than by understanding the context that made you come here.

So, what is context?

That's a very open ended question, so I'll answer it in 2 ways:

  1. Context is everything
  2. Context is a collection of data points that can be used programmatically to determine why you visited a web page, letting you act on it through configurable rules.
A Context Engine does the following two things:
  1. Determines the properties of the current context
  2. Evaluates the context and executes a certain contextual path or rule
Example:

In the current context we determine that you are using an iPad 2, it's 10 in the morning, it's the second time you came to our site today, and the last product you looked at was coffee.
Based on this information we can:
  • Make sure you see the tablet optimized UI for our site (server-side, with optimized images, not only RWD)
  • Give you a coupon for the nearest Starbucks
The beauty of this is not that it can be done. I (and most other Web people out there) could have written code for this back in 1999 (well, not really the tablet-optimized UI part). The beauty of it is that this rule was created by a content editor, using perhaps something as simple to use as SDL Customer Analytics (or who knows, Tridion Target Groups) and the Context Engine simply chose the most appropriate path based on your rules.
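A toy version of those two steps - determine the context, then evaluate rules against it - might look like the sketch below. Every property name and rule here is invented for the example; the real Context Engine resolves device and visitor data for you, while here the context is simply hard-coded.

```python
# Illustrative only: (1) a bag of context properties, (2) editor-defined
# rules evaluated against it. In a real engine the context would be
# detected per request, not hard-coded.

context = {
    "device.isTablet": True,
    "time.hour": 10,
    "visit.countToday": 2,
    "history.lastProduct": "coffee",
}

rules = [
    (lambda c: c["device.isTablet"], "serve-tablet-ui"),
    (lambda c: c["history.lastProduct"] == "coffee"
               and c["visit.countToday"] >= 2, "offer-coffee-coupon"),
]

def evaluate(context, rules):
    """Return every action whose condition matches the current context."""
    return [action for condition, action in rules if condition(context)]

print(evaluate(context, rules))  # both rules match for this context
```

The interesting part is that the `rules` list is data, not code paths: that is what makes it something an editor (rather than a developer) can own.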

Now if you extend the data awareness of a context engine to include data from your purchasing history (or interaction-with-my-brand) you suddenly open the door to way more ways to provide a contextual experience to any visitor, and you start being really good at understanding why I came to your site, and, who knows, maybe you'll even be able to sell me that great vacation I clearly need.

This is - again - not new. SDL Fredhopper, for instance, is an amazing Context Engine. What I think will be new by 2020 is that most sites will be using a Context Engine or similar technology to determine the context and decide what your experience should be. I also expect to start seeing cloud-based Context Engines (someone called them Context Brokers in the past) with all the serious privacy implications this includes...

One last note. As part of the development of the mobile aspect of Context, I've come to realize that most people ignore the fact that the device you are using is only a part of the context, not all of it. The WCM industry seems to be focused so much on how to show nice buttons on an iPhone that we seem to forget the bigger picture: why are you using an iPhone to come to this site? Are you on the move? Are you having a smoke outside? Are you right outside my shop? Are you in my competitor's shop doing price comparison?

If experience was determined by UI alone, then nobody would ever use craigslist. No, well-designed Context Engines put editors in control of selecting the right content for the right context.

Tuesday, November 05, 2013

Playing with the future - Part 2 - Content Ownership

As a follow up to my previous blogpost, here's the second concept we came up with on the topic of "How will content authors create content in 2020".

This idea might be a bit more radical than the first one... "Content ownership will be diluted".

There are many types of content creators out there, from the marketing-snazzy, crowdsourcing-heavy world of "modern social media buzzword compliant marketing" to the corporate, workflow-heavy, legal-review world of most of the customers I work with.

In some industries, it is perfectly acceptable to have someone from outside the organization create content for you - be it via "endorsed blogging", "fan content on Facebook", or even comments on specific pages that get promoted to full-page articles given their quality. This is something we already see happening today on a regular basis.



But the brand fan of the future is different. The brand defender of the future is possibly 16 years old, and is compelled to share because sharing is in their DNA - hyper-connectivity does that to you. So companies - including workflow-heavy legal compliance companies - will go out of their way to find methods to assess how much of a fan you really are, and possibly give you special rights to create content on their websites.
If you believe in my company and brand even more than I do myself, why would I stop you from contributing positively?
Here's how I think this will impact the world of content:
  1. Gamification principles and social media tracking will be used to accurately measure a person's brand-awareness level - you want to find those brand defenders out there, and you want to empower them
  2. Brand defenders will - from outside your firewall - have special privileges on your content platform, be it by being allowed to review content, or by being able to create content themselves. This process already happens today, but in a rather unstructured way. (I certainly get emails from brand defenders about content published to Tridion World; I can only imagine that Bart Koopman gets even more.)
  3. Brand defenders will be given access to marketing strategies, campaign ideas, and any other branding material. They will carry the flag for you in exchange for early access to data, exclusive T-Shirts and bragging rights. Why wouldn't you reward them in their own coin (data)?
In other words, brand defenders will become your "trusted content contributors".

I can certainly see a future where even the most legalese of texts gets reviewed by people who are - at first glance - completely unrelated to your company, but who know your brand value better than the people being paid to create that brand value. Where content is created for your website by your most loyal fans, and where content management tools are built with this in mind from the ground up. Where content review is done by people outside your corporate legal department (but likely not excluding legal completely), and where you provide your brand defenders with all the tools and data they need to be heard.

Tune in soon for my next non-binding futuristic play: Context Engines.

Friday, November 01, 2013

Playing with the future - Part 1

Some time ago I had an interesting conversation by email with my colleague and fellow Product Manager, Alexandra Popova. The subject was "How will content authors create content in 2020".

This spawned a whole series of ideas and concepts about content creation and - especially - content delivery: ensuring that the content that is shown is what you are searching for at this point in time. From there I ended up creating a slide deck I sometimes use, titled "The future of content - a non-binding futuristic play". I think it's time to put those ideas out into the world and see if there are any strong disagreements.

There are 5 main "ideas" that we think will be prevalent in 2020:
  • Schemas will disappear (as in, you won't see the content structure anymore)
  • Content ownership will be diluted
  • Context Engines will be mainstream
  • Data is the future of content
  • Any tool can be used to create content for a delivery platform
I left some out for the simple reason that by 2020 they will already be a strong reality: web content will stop being page centric (some argue that this is already the case, and I agree), content will be self descriptive and "atomic".

Anyway, let's dive into the 5 ideas that I think will be a reality in 2020.

Idea #1 - Schemas will disappear (from the editor's screen)

There will still be some niche markets (product catalogs and support documentation) where this type of interface makes sense, but as systems improve and become more reliable, you'll increasingly be using "smart" content editors that derive the semantic meaning of your content for you. There will still be a schema that your content must comply with; it just won't be "in your face". And no, HTML5 is not a content schema - at most it's a page schema and a vocabulary for content. You can present structured content using HTML5, but that's a result of its flexible design.

So, what are some examples of this out there? Well, our own SDL Xopus editor, for starters. It binds to an XSD just like most XML editors, but presents it in a completely familiar way to editors used to working with "less structured" content tools like MS Word or EverNote. (go play with the demos if you don't believe me)



Furthermore, the advances being made in what was once the exclusive domain of "enterprise search" software - entity discovery and concept mapping (see the excellent Apache Stanbol as an example) - mean that this type of technology is no longer confined to very expensive and rare software. No, it's starting to be available to anyone with a workable Internet connection. And if I, as a content editor, can let the software discover what my content is about and tag it appropriately, then I'm free to create my content and let the metadata tag itself (I obviously need to review and approve).
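A sketch of that "tag itself, then review" flow could be as simple as the following - where the extractor is a hard-coded stand-in for what an annotation service such as Apache Stanbol would actually return, and all names are made up for illustration:

```python
# Sketch only: the extractor below fakes an entity-discovery service;
# each proposed tag starts unapproved, awaiting the editor's review.

def discover_entities(text):
    # stand-in: a real system would call out to an enhancement engine here
    known = {"Amsterdam": "Place", "SDL": "Organization"}
    return {name: etype for name, etype in known.items() if name in text}

def propose_tags(text):
    """The machine proposes; the editor approves or rejects each tag."""
    return [{"entity": name, "type": etype, "approved": False}
            for name, etype in sorted(discover_entities(text).items())]

tags = propose_tags("SDL has an office in Amsterdam")
print(tags)
```

Note that nothing is published as-is: every tag carries an `approved` flag, because the whole point is that the machine drafts the metadata and the editor only has to review it.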

An interesting side effect of this is that you will not get less metadata. Au contraire, we're going to (finally?) get a lot more metadata that can be used to segment our content in ways that will allow our devices to display it properly.

Systems (and people) will struggle for a while, but as leading systems pick this up and start improving on it with usage, we'll all get better for it. And devices like HUDs on cars will easily display your content, in a way that won't distract the driver from what he needs to focus on.

Next week I'll post about how I feel the concept of content ownership will slowly dilute, and distributed ownership (including from outside your firewall) will be the norm.

Thursday, October 17, 2013

The clueless bin

As a nearly-40 ex-teenager (I swear that's how it feels sometimes) I spend some time wondering if "this is what it means to have experience". The little details of how you start seeing parallels on every meaning.

Like you realize how the knowledge worker in his decision factory is actually not that different from the blue-collar worker in the real factory (salaries aside). How everything is a cog in the great gears that power our individual worlds, and when boiled down to their essence everything starts looking like... well... the same thing.

In that line of thought, I started trying to pay attention to the little signs that indicate someone's been around for a while and knows what he's talking about. I think that I come across as an experienced person - because I am, see line above about being nearly-40 - but what exactly is it that gives it away?

I can read about all the psycho-freudian-jung-ish stuff about authoritative sound and positioning, but as far as I can remember I always sounded as if I knew what I was talking about, even when I had no real clue (very important skill when you're a consultant by trade), so I guess the real difference is that now I know that I do know what I'm talking about most of the time, instead of knowing that I was pulling rabbits out of my hat.

How does that transpire to others though?

In my professional life I have met and worked with thousands of IT professionals, and all of them left an impression, as is inevitable. My binary brain will default to sorting these people into "has clue" and "clueless", and while sometimes you get the privilege of working with someone long enough to do a proper evaluation, most of the time you get 1-2 days at most. And yet, that clue/clueless binary decision is made anyway.

Be aware, I am not trying to determine whether someone has or hasn't a clue - I'm trying to understand what it is about those people that makes me think they (don't) have a clue.

Some things throw me completely off:
  • Bragging - I did this, solved that - immediately makes me think of daddy issues and insecurity, a sign that this person needs others to think they have a clue. Goes into clueless bin almost immediately.
  • Asks the same thing twice. If someone asks me how to do something twice - even if it's 6 months or 3 years apart - that person goes into the clueless bin. Not applicable to real technical details, as in "which property of object X gives me its width" or "do you have a sample of how to get the current device context" - these get a degree of forgiveness - but definitely applicable to conceptual descriptions, such as "what is the Ambient Data Framework" or "why would I use Java for that". Perhaps I should rephrase to "asks the same thing twice without realizing I already explained it before".
  • Dismissing any technical challenge as "trivial" - especially if it has to do with CSS. I have the utmost respect for good CSS hackers. Great CSS hackers are like gods to me.
  • Blaming others for their failures. When ministers resign from their governments, it is typically because someone way down the chain did something they were not supposed to do. Is it the minister's fault? Most likely the minister didn't even know there was one person in their ministry whose job it was to empty trash bins in the first place, let alone that this person could have contacts with the press and was really good at reconstructing shredded paper.
OK, so what else can influence me on this binary decision?

  • Honesty. Being able to admit that they tried something and it failed miserably. You only truly learn with your own failures.
  • Accepting guilt, not too eagerly, but understanding that perhaps it failed because the person did not understand or listen to the advice given, or assumed that doing it alone was faster.
  • Understanding that sometimes you are way out of your depth and asking for help.
OK, so my goal was to have a more objective list, but I guess it truly is subjective and I should go read some more Jung before I can understand it. Or, I should show the wisdom that will soon be visible in my hair and just accept that people are how they are and embrace the beautiful diversity of people I have the pleasure to work with.

Tuesday, October 01, 2013

Introducing Responsible Web Design

How do we do Mobile with SDL Tridion?

If I had a dollar for each time that I've been asked that in the past few years, well, I'd have a few dollars. As a seasoned Tridion implementer I know that the best answer to this question is "how do you want to do Mobile with SDL Tridion?" or better yet "What do you intend to achieve with the mobile channel?".

Unfortunately, most of the time the person asking this question is not someone that can answer my question - they are looking for the magic "go mobile" button that will transform anything they currently have to "mobile-ready". And - as you probably know - Tridion is just not that type of system that can transform your content without you being quite explicit about what you're trying to achieve. It will do anything - as long as you can tell it what you want to do.

This question has been particularly more frequent ever since we acquired Bemoko, a UK-based mobile-delivery focused company, and there's been some confusion in the market about what exactly we set out to do with this acquisition. Let me state right here right now that we did not have a plan to give you that "go mobile" button with it (not immediately anyway). 

Instead, our plan from the beginning has been to provide you the tools you need to create a proper mobile strategy and presentation for your content. In other words, with the features we are adding to Tridion you will be able to create that "go mobile" button, and it will do exactly what you want it to do.

So we started following that path - what do our implementers need to provide cross-channel experiences for the sites they create - and released version 1.0 of the Tridion Context Engine (officially named Context Engine Cartridge) about 2 months ago, and are now getting ready to release an update with some interesting new features. The goal of this context engine is really to start providing developers with the tools they need to create "context-aware" experiences, and therefore the focus of the first release was on providing you knowledge about the device your visitors are using to access your site, on first click, on the server side.

There are already quite a few posts from the Tridion community about how to use this cartridge; I'd recommend reading this one by Rob Stevenson-Leggett and this one by our very own Eric Huiza. In a nutshell, this context engine - at the time of writing - gives you properties about 3 aspects of the current device:
  • Device
  • Browser
  • OS
Examples include "IsTablet" or "IsMobile", etc. Since most of this data is "buried" inside the Ambient Data Framework, I joined up with a few community members and created an open source project dubbed "SDL Tridion Context Engine Wrapper" to expose some of this information in a more web-developer-friendly way, and you will now find .NET server controls, HTML helpers for MVC.NET, and even a Personalization & Profiling extension to expose these properties as part of the visitor's profile.

What can we do with the Tridion Context Engine?

During the recent SDL Tridion MVP Retreat in Óbidos I sat down with Rob, Mónica and Julian and we decided that we would build a set of methods to enable Responsible Web Design, a framework to provide implementers ways to use Responsive Web Design while making sure they keep their bandwidth (and load speed) as low as possible. These are some of the things that the framework can be used for:
  • Remove navigation elements that would not display on a given device
  • Remove page elements (calls to action, sidebars) that would not display on a given device
  • Resize images to match the device characteristics
  • Resize images to account for Retina capable devices (not as simple as it would immediately seem)
  • Group devices into device families, simplifying the "targeting" of these solutions
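To illustrate the device-family idea behind the list above - with family names and a page model that are purely made up, not the framework's actual API - server-side filtering of page elements could be sketched like this:

```python
# Toy sketch of the Responsible Web Design idea: elements that would be
# hidden on a device family are dropped server-side, so they never cost
# bandwidth. Family names and the page model are purely illustrative.

ORDER = ["phone", "tablet", "desktop"]  # smallest to largest family

page = [
    {"id": "main-article", "min_family": "phone"},   # shown everywhere
    {"id": "sidebar", "min_family": "tablet"},       # tablet and up
    {"id": "mega-menu", "min_family": "desktop"},    # desktop only
]

def elements_for(family, page):
    """Keep only elements whose minimum family is at or below this one."""
    rank = ORDER.index(family)
    return [el["id"] for el in page if ORDER.index(el["min_family"]) <= rank]

print(elements_for("tablet", page))
```

The contrast with client-only Responsive Web Design is the point: instead of sending everything and hiding it with CSS, the server never ships what the device family will not show.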

We expect to add some really cool functionality to the Context Engine soon (like expression evaluation for audience segmentation), and we will continue to enhance its sister open source project as we go - and as we gain field experience with it. We will keep in mind that mobile is a channel that will continue evolving at a really fast pace (cars? glasses? road signs?), and we'll adapt as quickly as we can to support what the field really needs for a successful implementation of "the right content at the right time for the right device".


You can find the Tridion Context Engine Wrapper project on github.

Monday, September 30, 2013

Quick Summary of open source projects started during the MVP retreat 2013

The SDL Tridion MVP Retreat 2013 finished yesterday, and as usual in these retreats we start a few open-source projects with things that (we think) are interesting to the SDL Tridion community at large.
Here's a quick summary of the 4 projects we started (and shared) over the course of the last 4 days:

Overall I think this was a very successful event, with loads of information to come in the next few days about the projects mentioned above. Watch this space.


Sunday, July 07, 2013

What is the future of content?

I guess I have said it enough times - I am in a rather privileged position within the WCM space, as I get to see a lot of what really is implemented by our customer base (mostly in the highly regulated financial industry), I get to play with magic crystal balls, and I get to hear about what experts and analysts think content people should be doing now, and in the future.

And I admit, I am extremely confused, because the pieces don't fit together.

Let me try to explain what I mean with this.

On the one hand, we have simplicity. The people who select WCM systems (please note that I am not saying "Content Editors") want a "Facebook-like experience" creating content. They want the simplicity of WordPress or perhaps Blogger (which I use) when creating content.

On the other hand we have compliance and content governance. People want to control content life cycles, trace back the origins of content from its inception to its inclusion on a given page, to its transformation into a Call To Action on the home page, to its decommissioning a year later into the "outdated content" bin.

And finally, on the last hand (yes, I know that's three hands; get someone to lend you one), we want content re-purposing and targeting. We want content that can be used across all channels - Web, email, digital signage, Facebook, Twitter, iOS/Android apps, Google Glass and whatnot.

And this is where I get confused.

Simplicity means one thing - less flexibility. Sure, you can make very reusable content with WordPress, but then you're putting the onus onto your editors to use the correct html mark-up for your content. I agree that a lot can be achieved with some smart plug-ins, but how do you prepare for the next channel that you don't know about yet? And especially, how do you create truly "re-formattable" content when you want simplicity, which by its own definition mixes content and layout?

And governance. Governance is impossible without metadata. Usually, it requires metadata about your metadata (ever tried defining a release policy linked to content models?). Again, sure, we can try having your editors enter that metadata in their simple-to-create-content web tool. Doesn't look so simple anymore, does it?

Creating content that can be re-purposed for various channels - including those that don't exist yet - and that allows for personalization is where the whole thing falls apart, in my view.
How can you have content that was simple to create - web based, WordPress-like interface, loads of drag-and-drop widgets and SEO optimizers and navigation managers and whatnot - and have that content still comply with a strict schema, like those specified on schema.org? Sure, you can look at semantic engines - like Apache Stanbol - to help your editorial team get their content right, but how is that going to help you determine which headline to show in your iPhone app?


Don't get me wrong, I'm not trying to say that creating content should be hard - on the contrary, it must be easy. But creating content that can live beyond the constraints of a web page requires thinking beyond drag-and-drop and easy-to-use (actually, having to use your mouse when creating content is a terrible thing and completely breaks the flow of work) - you have to think about content modularity and new delivery models.

My view on the future of content is that layout and look-and-feel will disappear, and consumers of your content will care about nothing else than the content itself, the layout will be based on their preferences and their own metadata, with smart apps - in whatever is the device-du-jour - taking on the task of formatting that content for display. Don't think RSS, think Open Data+schema.org.
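As a toy illustration of that idea, here is one schema.org-style entity (field names borrowed from schema.org's Article type) where the consuming app, not the author, decides the layout per channel. The channels and render logic are invented for the example:

```python
import json

# A sketch of "content as structured entities": one schema.org-style Article
# rendered per channel by the consumer, not by the author. The "watch" and
# "api" channels and the render rules are hypothetical.

article = {
    "@type": "Article",
    "headline": "R5 is dead. Long live R5",
    "alternativeHeadline": "Long live R5",   # shorter variant for tiny screens
    "articleBody": "Today we finished the code base...",
}

def render(article, channel):
    if channel == "watch":   # tiny screen: short headline only
        return article["alternativeHeadline"]
    if channel == "api":     # machine consumer: raw structured data
        return json.dumps(article)
    # default: a web rendering decided by the consuming app
    return f"<h1>{article['headline']}</h1><p>{article['articleBody']}</p>"

print(render(article, "watch"))  # Long live R5
```

Note that nothing in the article itself says how it should look; each consumer picks the fields it needs, which is exactly what mixing content and layout at authoring time makes impossible.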

So, unless I'm terribly wrong (which is always possible, and known to have happened many times), the emphasis on Customer Experience Management requires more content structure, not less. And more content structure, today, means that editors must take an abstract view of content: focus less on how it looks and more on how it's structured, so that the context in which the content is displayed can adapt to the customer's expectations. To do that, we have to think beyond pages and re-think content as individual entities that can be manipulated and formatted for display in whatever device or app the customer may choose to consume it in.

And that, my friends, goes against the view that content editing should be a simple, word-editing-like experience. Just like word processors were built for a world in which your main goal was to print the document once it was finished, many WCM tools out there are built to publish web pages - and web pages will be just a tiny fraction of how content is consumed in the future (much like the number of people who actually print documents once they're done editing them).

Sunday, April 14, 2013

SDL Tridion and Rich Text Fields

In the past year we've been asked a few times how to add support for non-XHTML tags to Tridion Rich Text Fields (if you want to know how, check here, here or here) and this usually gets followed by a discussion with questions along the lines of:

  • Why do you want to do that?
  • Why is it not there out-of-the-box?
  • Is it supported?


So, here's my take on why you're doing it for the wrong reasons. Yes, you can modify the rich text settings to allow entering any possible tag, but every time you do this you are breaking the experience of your content editors for the sake of making your HTML markup work on your site.

When editors must create content that includes non-standard XHTML tags, they have to edit the HTML source of their content and add the tags themselves - guess how much of that will work when using Experience Manager? When implementing Tridion (or any other WCM) you may be tempted to take shortcuts to implement faster, and to avoid having to review the HTML/web app that you intend to use. But if your content editors are expected to edit their source HTML to insert "data-rule" attributes in links, you just made their lives harder.

What should you do instead?

In my view, non-XHTML attributes required on content should be dealt with in the most user-friendly way possible. If you need to use <article> or <section> tags in your RTF, why not convert <div class="article"> to <article> at publish time? If you need to insert data attributes on links, consider a similar approach replacing other, normal attributes (or, at the very least, consider writing an extension that allows editors to add those tags without editing the HTML source).
Every time you ask an editor to edit the source of their HTML to add attributes, you are asking them to dislike the system. Consider this when designing applications: making it hard to edit content has never been one of the goals of a WCM implementation project...
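To make the publish-time replacement idea concrete, here is a minimal sketch. A real implementation would do this in a template building block with a proper HTML parser; the class and attribute names here are made up, and the regex is just enough to show the idea:

```python
import re

# Sketch of "replace a normal attribute at publish time": editors set an
# ordinary class on a link in the rich text field, and the template rewrites
# it into the data attribute the front end needs. Names are hypothetical.

def rewrite_links(html):
    # <a class="rule-popup" ...> becomes <a data-rule="popup" ...>
    return re.sub(r'class="rule-([\w-]+)"', r'data-rule="\1"', html)

src = '<a class="rule-popup" href="/offer">Click</a>'
print(rewrite_links(src))  # <a data-rule="popup" href="/offer">Click</a>
```

The editor never sees (or touches) the data attribute; they just pick a style from a dropdown, and the publish pipeline does the dirty work.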

Tuesday, April 02, 2013

R5 is dead. Long live R5


Today we finished the code base for SDL Tridion 2013 and released it to customer support. Though not as revolutionary a release as SDL Tridion 2011 was, it brings a lot of functional enhancements to core Content Management features that our enterprise-level customers asked for, and (the best feature, in my view) the ability to use non-Tridion content as "regular" Tridion assets.

(If you want to know more about what's new with SDL Tridion 2013, take a look at the community webinar recording).

A consequence of this release is that, keeping in line with our support policy, support for SDL Tridion 2009 should end in the next 6-12 months (we always support the last 2 major Tridion releases, and offer a "grace" period for the version prior to that), and this means the end of the R5 line.

SDL Tridion R5 was initially launched in October 2002 and introduced a LOT of concepts that were quite revolutionary for the WCM world at the time:
  • Personalization & Profiling via Target Groups
  • Browser-only, fully functional, business-friendly interface with server-side XML and client-side JavaScript (I guess I can't call it AJAX because AJAX didn't exist back then!)
  • Extensible Event System allowing for implementation-specific automation
  • XML everywhere
  • Support for XSLT templates and VBScript/JScript templates
  • Template Building Blocks!
  • Component-based Content Management!
  • Java & ASP Content Delivery modules
  • Blueprinting!
OK, some of those items above already existed in R4 or even before that, but you get the point. And today, 10+ years later, we're moving away from the brilliant architecture that was put in place to support SDL Tridion's customers through the web revolution and have entered the brave new world of WCM 2.0.

Stop for a moment and think about how much (and how many times) the web has changed since October 2002. Now consider that the same core architecture that powered sites in 2002 is still powering major sites today - that's foresight. It shows the designers of the R5 core knew what they were doing, and where the world was moving to. With Tridion 2011 ("R6") we did some much-needed refactoring of the interface (browser support & extensibility being the core new features), but also quite a lot of lower-level refactoring that allowed us to introduce new products quite rapidly after that release (Experience Manager, Online Marketing Explorer, User Generated Content). And finally, with Tridion 2013 ("R7"), came the introduction of multi-item workflow (aka "Bundles") and External Content Libraries.

In other words, we're off to a good start on the R6/R7 dynasty - may it serve us (and our partners and customers) as well as the R5 architecture did.

Sunday, March 03, 2013

The rise of the corporate team player (good old TOM)

In case it isn't clear by now, I'll make it explicit: I love the Tridion Object Model API. Sure, our content delivery API is nice, and OData on the content delivery side rocks. But the TOM and I were love at first sight (even if TOM never really acknowledged or even winked at me, I'm fine with this one-sided relationship). TOM has matured quite a lot in the past few years, moving from an extremely flexible COM+ architecture to a semi-functional .NET implementation (TOM.NET) to a fully functional one (with Tridion 2011), and with its WCF, service-oriented face ("CoreService") it just shines.

One of the intended benefits of adding a WCF face to TOM was to open up access to our Content Manager core to any client - whether implemented by you or by us (for instance, the Content Manager Explorer and Experience Manager interfaces of our soon-to-be-released Tridion 2013 use the CoreService to talk to the core). But a few things happened - perhaps unintended initially - with this new-found freedom to interact with Tridion: customers started looking at the CM as something that other applications can talk to.

And Tridion has one functionality that most other applications lack: it can push content to your website. Any type of content. At any time of the day. And though I wouldn't recommend using Tridion to push your 70 million transaction records to your website hourly, it actually fits really well if you don't need to republish your full catalog all the time (which typically is a sign of a bad architecture to start with, but let's not go there). A decoupled WCM will always have trouble pushing time-sensitive information - like updating the price of 1,700 products in the next 2 seconds while at the same time publishing your new 7,000-page website. But we have no trouble at all pushing product info to the website, even if we don't manage it - or even know it exists beyond perhaps a SKU or product ID.

And I'm not the only one to have realized the potential of this. One of our customers is now routing all 3rd-party content via Tridion (through WCF) for editorial review and publishing to the website, something that was particularly painful for them to do before Tridion was added to the picture. Another one is considering using Tridion as the publishing mechanism for some "records stored in Oracle" that don't need editor attention - but do need to end up on the website, and that's hard to do with their current system. Yet another customer is looking at providing access to Tridion from their PIM, so "collections" of products can be published directly from the PIM's interface (while having Tridion do the heavy lifting).

This flexibility to play nice in the enterprise is becoming more and more key to today's always-on, always-connected "modern enterprise", which must deal with aging products that are incredibly hard to upgrade or modify to cope with "simple things" like publishing to a website. But when all it takes is to call a web service, upload a stream of structured content, and ask Tridion to "please publish this to our Live site", the whole game changes. Suddenly, it's not black magic anymore to get something on the website.
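As a sketch of that integration pattern, here is a stand-in (emphatically not the real CoreService API) showing the shape of the conversation: the back-office system saves structured content through a service and then asks for it to be published:

```python
# Hypothetical stand-in for the integration pattern described above. The real
# thing would be a WCF call to the CMS; this toy class only shows the shape
# of the conversation: save structured content, then ask for it to go live.

class PublishingService:
    def __init__(self):
        self.items = {}

    def save(self, item_id, content):
        """Store or update an item; the CMS manages it from here on."""
        entry = self.items.setdefault(item_id, {"published_to": None})
        entry["content"] = content

    def publish(self, item_id, target="Live"):
        """Ask the CMS to push the item to a publication target."""
        self.items[item_id]["published_to"] = target

svc = PublishingService()
svc.save("product:1700", {"sku": "1700", "price": 9.99})
svc.publish("product:1700")
```

The PIM (or any other system) never needs to know how the website works; it only knows how to hand content over and say "publish".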

And I love that too. EAI was my playground before Tridion, and now Tridion is providing EAI-lite features (don't hold your breath on having Tridion handle your bank transactions just yet, but as a publishing service provider for your internal tools it is awesome).

In a recent discussion with a customer we were debating whether to import product info into Tridion or keeping it where it is. The customer's architect was inclined to placing it all in Tridion, while we were telling them not to - probably a surprising action from a vendor, telling a customer to "not use our software to store your info" - but we had a really good reason for this: Tridion is not a PIM. It was not built as a PIM. It does not think as a PIM. So use your PIM for PIM-like tasks.
Likewise, your PIM is not built to publish stuff to the web or manage it once it's live. So don't use your PIM to publish to the web; just ask Tridion to do it for you.

I guess I may be quite a geek, but this idea of using software for what it was built for, and having it talk to other software so they can complement each other, feels just like poetry to me. :-)

PS - It turns out the architect's previous experience with WCM was with a very large competitor of SDL, which has notorious problems playing nicely in the enterprise, and that's why he thought he would have to import all content into Tridion first. He has seen the light now!

Saturday, January 26, 2013

My Tridion VM bootstrap project

As part of my luxurious and privileged seat in Tridion Product Management I get to test new stuff all the time. It's great to just point my browser to the build server and download the latest nightly build (stable or unstable) and start playing around with it - how's the notification area behaving? Can I assign tasks now? How's the workflow engine doing? etc.

While this is great, it has a certain, heavy, repetitiveness to it. Every time I get a new build I need to either completely uninstall Tridion and start again (we obviously don't bother doing migration scripts between minor builds) or start from a completely new VM.

And this means:
  • Create new blueprint
  • Create new schemas
  • Create new content
  • Create new publication target(s)
  • Configure Content Delivery for HTTP Upload
  • Configure Website
  • Configure Session Preview/Experience Manager
Even the most adept Tridion professional will recognize that this takes a LONG time to set up, and all to be destroyed soon, since a new build comes in tomorrow.

So I started, slowly, creating a series of C# command prompt programs to automate stuff for me. Initially these scripts would just create my diamond-shaped blueprint, a set of schemas, and some components. Then I added a script to import content via RSS, so I'd get some real content in it. Then I expanded it to also create pages for these components. Then I added configuring the Publication Targets and Target Types. Then I added expanding the pre-built Tridion web applications and copying the configuration for these. Then I added actually creating the sites in IIS. Then I added support for all this to be configured via an XML file. Then I added changing the default templates to include Experience Manager building blocks. And finally I added support for Tridion 2011 too.
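The overall pattern is simple enough to sketch. Here is an illustrative version (in Python for brevity; the real projects are C# console programs) of a config-driven bootstrap that runs an ordered list of setup steps, with all step names invented:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of a config-driven environment bootstrap: an XML file
# lists the setup steps, and a runner executes them in order. Step names
# mirror the list above but are otherwise made up.

CONFIG = """
<environment>
  <step name="create-blueprint"/>
  <step name="create-schemas"/>
  <step name="import-rss-content"/>
  <step name="create-pages"/>
  <step name="configure-publication-targets"/>
</environment>
"""

def run_bootstrap(config_xml, handlers):
    """Execute each configured step in order; return the steps that ran."""
    executed = []
    for step in ET.fromstring(config_xml).findall("step"):
        name = step.get("name")
        handlers[name]()            # each handler performs one setup task
        executed.append(name)
    return executed

# Stub handlers for the sketch; the real ones would call the CMS APIs.
handlers = {name: (lambda: None) for name in (
    "create-blueprint", "create-schemas", "import-rss-content",
    "create-pages", "configure-publication-targets")}
print(run_bootstrap(CONFIG, handlers))
```

Driving the steps from a config file is what makes it cheap to rebuild an environment from scratch every time a new nightly build lands.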


And now I've decided to share it with everyone. The two projects I use to prepare my environments are available under the MIT license on Google Code.

There are 2 projects in here: CreateAnEnvironmentForMe and ImportContentFromRss. Each does what its name implies, and there are reasons why they are separate which I won't elaborate on. I also offer no support whatsoever, and really just hope the community can help evolve this solution into something a bit more stable. This code is not intended to be used in production environments, and there are no guarantees it will work for you.

These tools are tools I use, and they're fit for me to use. If you need more rounded corners, please feel free to change them - contact me via this forum or the Google project if you want committer rights. I typically run these projects from within Visual Studio, because I expect them to fail here and there, and this allows me to quickly fix them.

Hope this project can help others out there.

Wednesday, January 09, 2013

Whatever the future may bring... will be outdated soon

Had an interesting request from a customer recently:
Ensure that the product will continue to support different form-factors and whatever the future may bring.
How can I answer that with anything other than a very bold "Of course!"?

Whatever the future may bring has been pretty much my domain lately. I knew that moving from Professional Services to Product Management would bring a rather large set of changes. For instance, I can't really complain about feature X not being in the product, since it is my job to decide what goes in the product. Another interesting change is that now everyone thinks I need their advice :)

Anyway, the future proofing of anything we do is not a small challenge, and it applies to everything we do today. For instance, my LinkedIn profile stated:
With 10+ years of experience, of which 8+ as a consultant
Though it might sound interesting, it had been written in 2004, making it quite out-of-date for something that contains numbers (I updated it now), which in turn made me rewrite my SDLTridionWorld profile to let you do the math instead.

So here we are, trying to decide how to future-proof a product, and we can't even write text that won't be obsolete next year. I was privileged to have seen Mike Walsh present at Innovate last year, and one slide that made an impression on me was this one. Yes it does, unfortunately it's a one-way communication channel.

So, how do I prepare for "whatever the future may bring"?

I guess this is one of the advantages of having a relatively light footprint in our content delivery stack. You want 2001-style XML flat-file publishing? You got it. 1999-style JSP with embedded code? You got it. Even (God almighty protect us all) VBScript ASP pages? You got it. What about MVC? You got that too. What about service-based? Yup, got it (with OData, no less).

So, what's next?

I have my own ideas about what's next, and we're working hard to 1) validate those ideas and 2) build them into the product, but I can't share that just yet (another one of those things that changes when you're in Product Management - don't talk about what you can't commit to). Keeping in line with what we've built up to today, it will be something that you can extend the hell out of, and it will have lots of functionality you will not discover until 3 years later... (WAI anyone?) And maybe, just maybe, we can finally drop Classic ASP support ;-)