Getting closer to your audience is something that every marketer should be passionate about, and one aspect of this is a shift to interacting at the individual level with customers, whether they’re authors, readers, students, society members, or institutions. Personally engaging with your users is of course part of this, and it’s rare to see a company in our industry that doesn’t have an active presence on social media to connect with their customers in this way. But there are other techniques to connect your customers with the news, product information or services that are most relevant to them.
Personalised marketing is something that’s gaining more traction as technology gives us new ways to interact with, and query, the data we hold about our customers. Amazon is a great example of how this data can be put to work – the coverage of their databases is so broad and deep that they can recommend products to you based on your browsing or purchasing history. They’ll also use data from millions of transactions to let you know what other people purchased alongside the items in your shopping cart. As a customer, you benefit by seeing other potentially useful items alongside the product you’re searching for – and Amazon benefits with every additional item you add to your cart. But you don’t need to have the same wealth of data at your fingertips to make some potentially game-changing alterations to your customer marketing.
If you’re already collecting customer data beyond contact information – for example, particular product types, subject areas, or event attendance – and if you use email marketing software that supports customised content, you can create highly tailored marketing communications that ensure each customer in your database sees the items that are of most relevance to them. With the average subscriber receiving over 400 marketing emails a month, making your email messages more relevant will help you cut through the inbox clutter and connect your customers with the information that’s of greatest value to them.
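The mechanics of customised content are simple enough to sketch. Below is a minimal illustration of matching content blocks to the interest tags you hold for each subscriber – all of the tags, field names and copy here are hypothetical, and in practice your email marketing platform would do this via its own segmentation or merge features:

```python
# Minimal sketch: choose tailored content blocks for each subscriber based
# on interest tags collected alongside their contact details.
# All tags, field names and copy below are invented for illustration.

CONTENT_BLOCKS = {
    "open_access": "New OA titles and APC discounts this month...",
    "events": "Early-bird registration is open for our spring seminar...",
    "textbooks": "Inspection copies now available for next year's courses...",
}

# Generic digest used when we hold no interest data for a subscriber.
DEFAULT_BLOCK = "Highlights from across our publishing programme..."

def build_email_body(subscriber: dict) -> str:
    """Assemble an email body from the blocks matching a subscriber's interests."""
    blocks = [CONTENT_BLOCKS[tag]
              for tag in subscriber.get("interests", [])
              if tag in CONTENT_BLOCKS]
    return "\n\n".join(blocks) if blocks else DEFAULT_BLOCK

subscriber = {"email": "reader@example.org", "interests": ["events", "open_access"]}
print(build_email_body(subscriber))
```

The point isn’t the code itself, but the principle: each recipient sees only the items relevant to them, and anyone you know nothing about still receives a sensible default.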
Getting started with personalised communication does need a little investment, from effective segmentation through to the preparation of tailored copy – but with customers increasingly expecting that marketing messages take into account the data you hold about them, it’s an area that is only going to become more important.
Last week, I had the privilege of chairing UKSG’s One-Day Conference on ‘Open Access Realities’. There was a full house of about 150 people, with librarians, publishers and other ‘interested parties’ such as subscription agents and technology vendors fairly equally represented – this is not always the case at such events, and reflects UKSG’s unique role in ‘connecting the knowledge community’. The programme was also different to many other open access events that I’ve been to, with a particular focus on the practical realities of implementing OA. Although there are still debates to be had at the ‘cutting-edge’ of the movement – for example, in the area of open access to data – it’s also important to step back from the ‘frontline’ and ensure that organisations across the community are keeping up with the ‘baseline’. In my introduction, I suggested that we can compare the progress of OA to Bruce Tuckman’s model for group development (below): the idea of OA was formed, has been through quite a stage of storming, and we’re now in the process of ‘norming’ – working out the logistics, diversifying its application, taking different routes around roadblocks, trying to pin down a common language, experimenting and developing.
Within that analogy, it’s events like UKSG’s One-Day Conference, that focus on the practicalities, that will help us achieve the stage where OA is comprehensively ‘performing’. I thought it might be helpful to share the points that gave me most food for thought on the day:
- Contrary to what many assert, the general public does access and read research content: “If PLOS gets an article on the front page of Reddit, we get 140,000 readers” (Damian Pattinson, editorial director, PLOS)
- Dependent as it is on the subscription publishing model (and publishers’ policies), how can green OA be more than a promotional model during a period of transition? (Lars Bjørnshauge, director of European library relations at SPARC Europe, and director of DOAJ)
- In order for libraries to be able to transition budgets to fund APCs, they should centralise (nationalise?) procurement and management of the core / majority of content that is common across most institutions (Lars Bjørnshauge again)
- Since Finch, there has been more progress on increasing global access to UK research than on increasing UK access to global research. We must be careful not to get too far ahead, and end up bearing a disproportionate amount of the global costs of OA (Michael Jubb, director of the Research Information Network)
- For university leaders, open access (to research publications) is only one aspect of a wider trend toward transparency; the Research Sector Transparency Board is also focussed on open data and data security (equally big, if not bigger, issues) (Adam Tickell, provost and vice-principal, University of Birmingham)
- OA’s facilitation of data mining helps to identify research misconduct in ways peer review never could (Adam Tickell; Peter Murray-Rust later showed an excellent example of this, where a machine reading an article identified a doctored image that the ‘naked eye’ could not see)
- Agile, innovative responses to OA can be better served by a ‘hacker culture’ of small organisations and individuals collaborating than by established organisations where expectations are too high to allow trial and error (extrapolation from points made by Caroline Edwards, lecturer at Birkbeck and director of the Open Library of Humanities)
- Small initiatives can also benefit from extensions of the ‘gift culture’ that exists in academia, where academics are used to giving away their work and time for free (Caroline Edwards)
- Publishers’ perceived slowness in terms of OA adoption in part reflects that “we’re a service industry based on the needs of researchers” and there isn’t yet a clear grassroots demand to help inform the nature of the OA transition (Vicky Gardner, open access publisher, Taylor & Francis)
- Content mining is an important extension of OA rights – publications should be made more machine-readable to maximise their value to ongoing research and application (Peter Murray-Rust, reader in molecular informatics at the University of Cambridge)
People are often considered right-brain or left-brain dominant. This influences how easy (or difficult) they find certain skills, and how they view the world. Right-brain dominant people find creative, intuitive, random, subjective activities easier, while left-brain dominant people find logical, sequential, rational, analytical activities easier. Artists are an obvious example of extreme right-brain dominance, while data analysts are a good example of extreme left-brain dominance. Most people fall somewhere on the sliding scale between the two extremes, but will have a preference for one or the other. There are tests to find out which you are.
All teams will need a mixture of both left-brain and right-brain dominant members. Marketing has traditionally been thought of as a field dominated by right-brain players, perhaps because of the prominence of advertising and other ‘push’ strategies. It is important to maintain a balance though. For those interested in the neuroscience, there is an excellent TED talk with an amazing insight into the loss of the left hemisphere from Jill Bolte Taylor, a neuroscientist who had a stroke.
In marketing, both left-brain and right-brain dominant people bring different expertise and skills to the table. Left-brain activities lead to more data driven intelligent marketing; right-brain activities lead to more imaginative marketing. With the last few years of economic woes, the increase in the need to justify ROI has, in some people’s eyes, switched the priority to left-brain skills. Being data driven helps by creating a more targeted approach by using market penetration analysis and defining clear customer segments with demographic and geographic analysis. The right (i.e. relevant) message can then be sent to the right person in the right place and at the right time. Measurement is also an important part of left-brain marketing skills – setting targets, monitoring activity and adapting.
What is less obvious is how right-brain marketing can help drive ROI. A brand strategy that emphasises your difference to the customer and a product development strategy that taps into innovative ideas within your company will also help increase turnover. An effective brand strategy is one based on your organization’s personality, which shares aspirational values with your customers, uses engaging messages, and creates experiences that your customers want. Creativity can also be brought into product development by using techniques that explore new possibilities and challenge accepted norms (e.g. ‘emptying the box’ and ‘reversal’), making the leap beyond the obvious to reach visionary, innovative ideas, and then connecting the best of them. Right-brain skills also help build relationships with communities and encourage advocacy behaviour.
Infographics are being increasingly used in many marketing contexts, and are working their way into the scholarly information sector. They enable people to evaluate and digest information visually, making it easier to scan and, for many people, more memorable. Infographics can also liven up an otherwise densely text-based document or website, and express something more than a stock image.
What is an infographic?
There’s no strong consensus as to what, exactly, an infographic is. For some people, it’s a diagram that visually represents some data. For others, it doesn’t even need to involve data but can be some artworked text. When I use the term infographic, I mean a graphic where the structural and design elements used to convey the data are meaningful in themselves, reflecting either the overall topic of the graphic or a metaphor for that topic. For example, I’m a big fan of Kester Mollahan’s “Vital Signs” graphics in The Sunday Times Magazine, such as this one that asks “Who makes the most from the movie industry?”
Top tips for creating infographics
So, how do you go about creating an infographic? Here’s a summary of the process the TBI team goes through when we’re creating infographics. A key word here is “team”. It’s helpful to bring together different skills: creative design capabilities are perfectly complemented in this process by data visualization skills and by broader communications expertise; if an infographic, more than any other image, is the “picture that paints a thousand words”, then you need to be clear what those words are before you start painting.
- Digest all the information that forms a background to the graphic, and filter this down to the key points that actually need to be conveyed visually – a common mistake is to cram too much information into the graphic, undermining its ability to convey information clearly and quickly.
- Find the story – if this were text, what narrative would you weave around the facts to get them across clearly and memorably? This critical part of the process is often overlooked, meaning you get into the visual stages without having a clear and simple sense of what you are trying to convey. Creating an infographic without having “found your story” is like playing Pictionary and being given entire essays to draw instead of nice, concise keywords.
- Decide whether the story lends itself directly to a strong visual (as does the example above) or whether a metaphor might be useful to give added visual punch. Statistics lend themselves more readily to direct illustration; metaphors are helpful when you are trying to convey something more complex and abstract, such as the role and function of a service or system, particularly one that might form an essential but not necessarily sexy – or strongly differentiated – part of organizational infrastructure. Talk to the team selling the product / system / service – they might already employ metaphors that you can build on. In the past, I’ve used a rail network to represent a major system consisting of different modules (lines), each of which has multiple features (stopping points) supporting customer processes that, here and there, intersect (junctions). The visual metaphor was then complemented in the surrounding brochure’s copy with verbal metaphors such as “make the connection”, which in turn aligned with the client’s top-level brand messaging about moving content forward.
- Choose an image / shape / visual theme that represents the overall story (or its metaphor) and think about how the components of the story can be conveyed in a way that is meaningful within the overall image or visual metaphor – as in my example above, using the different lines, stopping points and junctions of the overall rail system to represent different aspects of the story.
- Draft your graphic, applying relevant brand guidelines (for colour palette, typeface, balance of white space, etc). Pare it back as much as you can – avoid unnecessary visual detail and edit text elements as much as possible. Test it out on people who haven’t been involved in the design process and who aren’t familiar with the concept being conveyed – are they quickly able to understand what the graphic is telling them? If so, then you have passed the infographic test!
In my last post, I looked at ways to help update your Facebook strategy to better connect with your customers on social media.
Twitter poses a slightly different challenge in that the sheer volume of posts on an average user’s timeline can drown out your message. More than 10,000 tweets are posted every minute by Twitter’s c. 115m active users – and with the average Twitter user following 102 accounts, your Tweet’s ‘prime’ is probably within the first 20 minutes of posting.
Organizational Twitter feeds – particularly those operated on behalf of a journal or publisher – tend to be used mostly as a type of newsfeed to announce the publication of articles, books, journal issues or other bulletins. This means that if you post article or publication links regularly, important news can struggle to stand out in your and your followers’ timelines. With that in mind, some publishers have hit on a creative way of ensuring top stories catch their followers’ attention – linking to the same content at different times throughout the day or week with new descriptive Tweets. You can use tools such as Tweriod to find the best time of day to send your tweets, and schedule them in advance using a social media dashboard or a service like Buffer. Combining these with URL trackers such as Bit.ly will help you gauge responses to help pinpoint which messages and times of day work best for your followers.
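The scheduling side of this is straightforward to sketch. Here’s an illustrative example of staggering several differently-worded tweets that all point at the same article, so each posting catches a different slice of followers’ timelines – the times, copy and shortened link are all invented, and in practice a dashboard such as Buffer would do the actual posting:

```python
# Sketch of spacing out several tweet variants for one article.
# The copy, timings and bit.ly placeholder below are purely illustrative.
from datetime import datetime, timedelta

def stagger_posts(variants, start, interval_hours):
    """Pair each tweet variant with a posting time, spaced evenly apart."""
    return [(start + timedelta(hours=i * interval_hours), text)
            for i, text in enumerate(variants)]

variants = [
    "New research: how reading habits are shifting to mobile http://bit.ly/xxxxx",
    "Why does mobile make up 10% of web traffic? Our latest article http://bit.ly/xxxxx",
    "ICYMI: our most-read piece this week, on mobile reading http://bit.ly/xxxxx",
]

schedule = stagger_posts(variants, datetime(2014, 3, 3, 9, 0), interval_hours=5)
for when, text in schedule:
    print(when.strftime("%a %H:%M"), "-", text)
```

Using a tracked short link in every variant is what lets you compare which wording and time of day earned the most clicks.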
Not only is Twitter a good place to start conversations with your customers or members; it’s also becoming the first place they’ll turn to if they have a query about your product, service, or brand – or a complaint.
Dealing with dissatisfied customers is a rite of passage in social media, and though it can be a bit of a minefield, handling a complaint well in a public forum such as Twitter can help increase your followers’ sentiment towards your brand.
It’s important to distinguish genuine complaints from those who just want to ‘troll’ your accounts, though – many top brands’ customer service teams engage with customers’ complaints first via a publicly-visible “@” response, asking for more details to be sent as a Direct Message (or DM) in order to resolve the issue. This shows that they are responsive to customer complaints without cluttering up public timelines with the ins and outs of resolutions.
Alongside Facebook, Twitter is an essential part of any organization’s social media strategy. If you’d like to know more about how TBI can help social media have a greater impact for your company, why not read our case studies, or ask about our staff training programmes?
We made an interesting observation from the results of our Heatmaps survey carried out at the beginning of this year: While publishers consider external-facing brand communications a high investment priority, they do not have corresponding levels of internal investment to support it.
Most publishers appreciate that their brand is what differentiates them from their competitors, and that it can be their most valuable asset. However, a publisher’s brand promise is not just delivered through journals and online platforms, but also through customers’ interactions with its employees.
Putting your employees at the heart of your brand strategy makes for more powerful customer relationships, better customer loyalty and better advocacy. It also ensures that the brand promise to your customers is consistent with their experience. Any company can invest thousands in advertising proclaiming that it is customer-focused, but nothing conveys this more clearly than having a helpful person on the end of the customer service phone line.
The organizations that ‘live’ the brand from the inside out often see greater reward as a result. Virgin is one of the best-known and most successful brands in the UK; it epitomizes how this ‘inside out’ approach can translate an engaged workforce of more than 50,000 employees in 34 countries into a powerful (and profitable!) brand.
“For us, our employees matter most. It just seems common sense to me that if you start off with a happy well-motivated workforce, you’re much more likely to have happy customers. And in due course the resulting profits will make your shareholders happy.”
Investing in an internal brand strategy can feel like less of a priority than other channels, but having a strong team of brand ambassadors within your company can do more to ensure strong, consistent and effective brand communications long-term than a whole room full of logo-covered pads and pens.
Here are some areas to consider when implementing an internal plan to support your brand strategy.
• Find a simple way to communicate the brand – whether through a mission statement or a simple brand idea, it is important to check that all staff understand what it means for them as well as for the company;
• Communicate the brand internally – whether via the intranet or a brand guide, all staff need to be able to identify easily with the brand;
• Give staff opportunities to ‘live’ the brand – help them find different ways they can express the brand promise through their work;
• Identify brand advocates throughout the organization – management should show leadership in living the brand, but members of staff at all levels who show particular understanding of, or empathy with, the brand can help enthuse and communicate it to those around them;
• Recruit to your brand – this ensures new employees ‘fit’ with your brand personality, so they find it easy to ‘live’ the brand;
• Know your brand – if you don’t yet have a clearly articulated brand idea or a widely used and accepted brand platform, then this is the place to start!
Many of you will have been at the Society for Scholarly Publishing annual conference in San Francisco earlier this month. The conference seems to have taken on a new lease of life in recent years, with a growing number of delegates, and an increasingly substantial program (props to program chairs Jocelyn Dawson and Emilie Delquié for a job well done). Of course, much of the business of a conference is also transacted in the discussions that take place in, around and beyond the conference venue – at the dinners, receptions, and even (thanks to jetlag?) the surprisingly buzzy breakfast meetings. So, like all good conferences, SSP passed in something of a blur, but it’s interesting for me to take a step back a week or so later and think about the key points that have remained with me:
Closing the loop was the theme of Tim O’Reilly’s opening keynote; I think it’s fair to paraphrase it as “using data and technologies to enrich products / services and make them work better”. He gave a myriad of examples of what he means by this, from Google’s driverless car to distributed peer review of open source software. He laid down the gauntlet for the assembled publishers – how can we reinvent some of the more dated aspects of our ecosystem (think Impact Factors, peer review) to make better use of available data and technologies? While throwing out suggestions such as Wikipedia-style revision control, Tim also made the point that scholarly publishers could make more of the opportunities offered by being closer to their markets than some other (trade) publishers (a drum TBI has been banging for a while with our talks on advocacy and relationship marketing – and indeed, I gave a talk on “getting closer to customers” at SSP the following day). He also picked up on the notion that, as we evolve to become more service-oriented, publishers begin to look more and more like societies – so we have a lot to learn from each other. In short, said Tim, publishers need to take seriously the obligation to reinvent the world of information dissemination.
Much of this reminded me of the talk given by David DeRoure at the recent ORCID–Dryad Symposium on Research Attribution, where he talked about “the social machine”, in which big data comes together with social technologies (and people’s use of them) to overcome past obstacles in creative, intelligent, joined-up ways. If I’ve understood both speakers correctly, then the “social machines” David talked about are examples of “closing the loop” – and both are very inspirational for publishers. Check out David’s slides on Slideshare (“2066 and all that”).
Having heard Tim O’Reilly open the conference, everything else I said and heard there seemed to be shaped by, or interpreted within, the context of closing the loop. Talks about standards – such as that given by Ringgold’s Jay Henry, in which he made a well-supported plea for standards such as ORCID to be better used, even mandated, by publishers – seemed to fit well with this theme. O’Reilly’s reference to Eric Ries’ “minimum viable product” also seemed to capture the zeitgeist, with many publishers seeing this as a way to make the most of (seemingly minimal) product development budgets and pursue more innovative approaches to everything from discoverability to video. O’Reilly – again – referenced Lynda.com’s $70m video training business: “Take video seriously,” he said. “Take small units of video very seriously.” And of course several publishers are, with Elsevier and IOPP among many who have reported significant increases in content usage driven by video abstracts.
Finally, of course, it’s not just publishers who are innovating (or need to) – an excellently curated panel session on MOOCs, with a set of speakers from different departments and roles at Stanford University, provided a fascinating insight into what institutions are doing to reinvent themselves and reach wider audiences. I enjoyed hearing that Stanford has appointed a Vice-Provost for Online Learning whose mission, among other things, is to “unleash creativity and innovation in online learning”. An aspiration for all of us, perhaps!
I gave an introduction to Open Access at a UKSG seminar in London last week and one major issue jumped out for me. Open Access (OA) may have been around for some time but buzz around this topic has suddenly taken off. The conversation isn’t confined to listserv chatter either; it has reached the national news in the UK and US. But it’s a very top-heavy discussion. Let me explain….
Policy makers across the world are issuing directives and funding bodies are responding with mandates that ensure research they support is made Open Access.
But what of the researchers themselves? Where is the grassroots response? Well, the Cost of Knowledge boycott against Elsevier shows perhaps they do care in principle. But you couldn’t say uptake of Open Access is rocketing. Research shows that authors are more concerned with the speed of publication, standard of peer review and citations or impact of the journal; and are worried about plagiarism, predatory publishers, myths about poor peer review and the perception of the value of their work being affected. And this brings me to my point: there is a lot of ‘stick’ out there to compel researchers to publish via Gold Open Access or deposit their article in a repository (Green OA) but there’s not a lot of ‘carrot’.
There is an opportunity for publishers, librarians and service providers to support authors with OA and help them get the recognition they want for their work. There are some very simple ways to enhance the author experience, such as the author feature in Bone & Joint.
In the UK the RCUK mandate hasn’t yet produced a huge sea change in how researchers publish their work, so it will be interesting to see the response that the White House Directive gets in the US and how Horizon 2020 will affect the EU. I’m not holding my breath though; OA uptake for Nature Communications actually fell from 60% to 30% in 2013, so it may take some time. Of course we also mustn’t forget that not all researchers are influenced by funding agencies: in fact, in a recent OASPA survey only 23% of humanities & social science authors said they have research council funding. So am I waiting in vain? Perhaps it will just take time before a fleet of authors thirsty for OA appears over the horizon?
The concept of the web in your pocket is far from a new one – WAP and iMode have been available on feature phones since before the turn of the millennium – but the advent of the smartphone, and fast 3G and 4G data networks, have resulted in a fundamental shift in how people find and consume information. And if you’ve noticed family members, colleagues, or fellow commuters poking at a portable screen of some sort, you’ll probably not be too surprised to learn that mobile makes up at least 10% of all internet traffic, and it’s growing fast.
Understanding how mobile changes an audience’s experience – and expectations – of your content is a challenge that many publishers are still catching up with. Some publishers, such as the BMJ Group, have developed Apps that act as windows onto their content, allowing subscribers to access information on the move; however, libraries and publishers alike frequently run into administrative roadblocks when trying to cater for remote users, as Judy Luther discusses on the SSP’s Scholarly Kitchen blog.
A large part of the challenge in adapting content for mobile is the myriad hardware configurations that your customers are likely to be using. Device-aware site designs, which automatically serve a mobile version to visitors on a mobile device, might be ideal for pocket-sized screens but impractical on a tablet; responsive web design overcomes such issues by providing an optimal viewing experience at any screen size, but can make for higher testing and design overheads; and platform-native Apps can play to the strengths of individual devices, but can be very expensive to develop and support.
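The core idea behind responsive design can be reduced to one decision: given the viewport width, which layout should the visitor get? In real sites this is done with CSS media queries rather than code like the following, and the breakpoint values below are common conventions rather than any standard – but as a toy sketch it captures the logic:

```python
# Toy illustration of responsive "breakpoint" logic: pick a layout from
# the viewport width, the way CSS media queries do. The pixel values are
# common conventions, chosen here purely for illustration.

BREAKPOINTS = [  # (minimum width in px, layout name)
    (1024, "desktop"),  # full multi-column layout
    (600, "tablet"),    # fewer columns, larger touch targets
    (0, "phone"),       # single column, collapsed navigation
]

def choose_layout(viewport_width: int) -> str:
    """Return the first layout whose minimum width the viewport meets."""
    for min_width, layout in BREAKPOINTS:
        if viewport_width >= min_width:
            return layout
    return "phone"

print(choose_layout(1280))  # desktop
print(choose_layout(375))   # phone
```

One design serves every device; the trade-off, as noted above, is that each breakpoint is another layout to test.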
Our upcoming TBI Masterclass, Going Mobile, looks at ways in which the rise of mobile will affect how your customers interact with your content. For more details, and to register for what is quickly becoming our most popular Masterclass session, click here.