Iona Townsley says “My additional insight is that digital PR campaigns need to work a lot harder than they have done in the past to achieve the same kind of coverage.”
Okay, so you've said that you're seeing less impact from digital PR campaigns, and that has led a few people to be a bit more reactive. But you also say, and we've talked about this a little beforehand, that you need to focus on several different areas: watertight data, strong methodologies, great copy, and validating campaigns before launch. So, let's dive into those areas. What does watertight data actually mean in practice?
“Data is a really big way of finding a PR hook and telling a story; we see a lot of data-led essays, studies, or indexes that tend to work really, really well. But we've found that more recently, the data being pushed out might not be as watertight as it needs to be. I feel like digital PR has been inspired by those big studies, like the World Happiness Report, where they've got tons of experts and really watertight, strong data to work with to create an annual report. Those kinds of projects get lots of pickup in the press, and I feel that's where digital PR has taken its inspiration for creating these data stories. But the issue is that in digital PR, in agencies, you don't necessarily have enough time to put together stories like the World Happiness Report, and we don't have that much access to experts. That leads us to try to do quick, fast data. And a lot of the time, we're seeing that journalists are actually able to see through data that isn't as strong as it should be.
“So when I say watertight data, I mean data that's coming from really reputable sources. And this is something you need to build a competency in yourself: working out whether the data is actually giving you the right kind of information. You need to take an extra step rather than just taking the data because it's what you need for your project, and work out whether it's the right kind of data to be pushing, whether it's reliable, and whether it's coming from a source you can trust. With watertight data, you also need to make sure that the methodology is super, super strong. I'm seeing a lot of the time now that people are publishing digital PR campaigns without listing the methodology, even though they've got whole sets of stats and data that they're building a story around and pushing to journalists. They're not actually saying where their sources are from or how they've calculated an index. That loses a lot of trust when you're pitching to a journalist, because if they can't understand why you've done it a certain way, how are they supposed to tell their readers that this is reliable data? Watertight data is something there's been a lot of talk about in digital PR, especially over the past few years. It's something we need to keep to a standard, rather than deciding that because we don't have enough time we can't do it. Maybe some of us aren't as competent with data, because data is a skill in itself; you can't expect everyone to be really good at it, and it's hard to keep on top of. But I think as digital PRs, we need to be really on top of that. And if you're not the one dictating how projects or the PR strategy are managed, you need to push back a bit more. I think that's one of the ways digital PR campaigns need to work harder.”
One of the things that you touched upon there was reputable data. So, what's an example of a reputable data source? And how do you decide on the kind of data that you want to collect?
“So, a reputable data source would probably be something like the US Census. That's where the US government goes out and gathers information from people, who have to respond to the census. You've got millions and millions of people giving information to the government, which is then presented in the census for you to go ahead and use. That's reputable data, because it's coming from a trusted source, the government, and you've got a huge sample. Compare that with lots of data sources out there, which I won't name because I feel like I shouldn't, where people reply to something or say ‘this is how much I think X is’, but then you find out that only 50 people have responded. If you're saying that in Slovakia 50 people have said the price of a pint of Guinness is two pounds, well, 50 people out of the entirety of Slovakia is not a lot at all. That's not reputable data.”
So, what threshold do you have to cross before your data is likely to be considered reputable?
“I've been doing a lot of research on this, because I have an issue with survey campaigns as well, even though I know they can work really well. There's kind of an amount, but it depends to an extent on what you're personally comfortable with; everyone has their own threshold. I don't know how to explain it, but some people are happy with survey data, and you have to think about what a journalist is going to be happy with for what that campaign is going to achieve. If you leave it at that, then you're going to be a successful PR, because you're getting coverage from journalists. Say 2,000 people is enough for a survey. But journalists sometimes also aren't that good at being reputable with their own data; they don't check as much as maybe they should. So you could have your own personal bracket of how many people you need for that data to be reputable. And I think it's an important thing to note, because we're always doing stuff that's meant to get picked up by journalists, that's what we're trying to do, but I feel that's again why people are starting not to pay attention to the data or to having enough reputable data: journalists are starting to let through that lower level of content, where they're not necessarily checking, or they don't mind that only two people have said this one thing. You'll see it with The Daily Mail as well, where they'll say Twitter is in uproar because Meghan Markle has done this, but when you actually try to find the information, there are only one or two tweets of somebody saying that. They're not being reputable with their own data. So, it's hard for people to keep up those data standards themselves. And it's completely different based on the kind of project you're going to push, whether it's a survey or gathering data online. But as a standard rule, getting as many people as possible is the best way to go about it, and doing it to your own personal ethics, but also to the ethics of the journalist.”
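To put those sample sizes in perspective, here is a minimal sketch, our own illustration rather than anything Iona cites, of the standard 95% margin-of-error formula for a simple random sample. The 50-respondent and 2,000-respondent figures come from her examples; the worst-case proportion of 0.5 and the z-score of 1.96 are standard statistical assumptions.

```python
import math

# Approximate 95% margin of error for a proportion from a simple random sample,
# using the standard formula MoE = z * sqrt(p * (1 - p) / n).
# p = 0.5 is the worst case; z = 1.96 corresponds to 95% confidence.
def margin_of_error(n, p=0.5, z=1.96):
    """Return the margin of error as a fraction (e.g. 0.139 means +/-13.9 points)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (50, 2000):
    print(f"n = {n:>5}: +/-{margin_of_error(n) * 100:.1f} percentage points")

# Expected output:
# n =    50: +/-13.9 percentage points
# n =  2000: +/-2.2 percentage points
```

On those assumptions, a 50-person sample carries a margin of error of roughly 14 percentage points, whereas 2,000 respondents brings it down to about 2, which is a reasonable back-of-the-envelope way to judge whether a survey figure is solid enough to pitch.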
You also mentioned that it's good practice to incorporate your methodology for collecting your data within your release. So what's an example of a good methodology and how should you actually include that as part of your release?
“I think it's important to note that the methodology shouldn't just be in the release; it should be in the content you've got on the website too. So if you're presenting a study on the website, then anywhere you're talking about that study, whether it's a dedicated webpage or a press release, you need to have your methodology super, super clear, or a link to the methodology, so people can go through it with a fine-tooth comb. I've always said that if your methodology isn't clear enough for someone else to copy, then it's not clear enough at all. But I don't know why people aren't putting in proper methodologies. I don't know if it's just because they're time-strapped, or because they're scared of other agencies copying their methodologies, which I know some people are worried about; that's why a lot of people don't actually share their work. But the methodology is there to give people more insight into where you found your data, or why you're telling the story that you're telling. At the end of the day, you need to pass that methodology on to the journalists, so everyone's going to have it somewhere, but we're finding that they're not presenting it as much on their websites. In general, the methodology should clearly state what you've done and how you've done it, link to the sources, and, I guess, give the date you actually captured the data; you should put as much information on there as possible. Maybe not step by step how you pulled search volume from Ahrefs, for example, but making it clear what Ahrefs is, linking to it, that kind of thing.”
You also mentioned that great copy is something that's needed in 2023. So, what constitutes great copy?
“It can mean different things to different people. But if you're presenting a data study, a good way to look at it is to make sure that everything you're writing is clear and informative. So, at the top, you might have things like key findings to pull out the top stories that you're going to end up pushing to journalists. These could be particular large countries or cities that you want to focus on, any surprising stats, that kind of thing. The way you set out the landing page is also very important. You want to make sure that you've got headers, that you're pulling out the most interesting stats, and that everything is really clear. With copy it's very easy to start fluffing things up, which is not what anyone wants, and it's such an easy mistake to fall into. If you're a journalist, or just the average reader landing on that page, you want to get the information as quickly as possible; you don't want to start reading tangents about the data. With copy it's also good to add additional references. So whilst you've got your data study and the key stories that you want to go out with, it's good to add in additional sources, which might be stats to support your story. It shows that while you've created this data story, it doesn't exist in a vacuum; there are other things backing it up, or making it more relevant to why it should be pushed out today.”
So, what's some general advice you could offer on writing a great headline and a great introductory paragraph?
“I think writing a great headline is an interesting one. A lot of the time, the headline on the landing page is going to be very similar to the subject line you push out to a journalist when you're outreaching. With ideas, for example, you want to make sure your idea fits into one or two lines when you're pitching to somebody. Make sure it's clear and concise, and then that's probably going to be the one line that gets pushed out onto the landing page and into the press. So, make sure it's super clear and super concise. As the header of the landing page copy, I'd just write a really simple, overarching view of what the project is, so “these are the cities ranked on blah, blah, blah”. I wouldn't dig in and say “London is the best city for X”, because then you're starting to give away your data; you want to just show people what it is, not start telling them the data before they've even decided whether they want to read on. And then for subject lines, because you're saying headline, it's making me think of both subject lines and landing page headers. For the subject line, you just need it to be super concise. Again, make sure it's something the journalist is going to be interested in; if it's regional outreach, pick out the key cities or counties or whatever it might be that you're outreaching to. But the main thing is to keep it super clear about what you've looked at, and super concise.”
So, using your specific example, instead of saying “London is the best city for X”, you would say, “the best city for X revealed” or something like that?
“If it was a subject line, “London is the best city for X” is definitely something you could use, especially if you're outreaching to London-specific press. If we're talking about the landing page copy and the header, I'd just put something simple, like “the best and worst places in the UK for...”. I can't think of anything off the top of my head, but you get the idea.”
And moving on to validating campaigns before launch. What does that mean?
“Validating your idea is making sure that whatever you end up going forward with, whatever you end up pushing out, is ideally going to work. So if you have an idea that you start working on, and you've got all these questions in your head, and you're not sure if it's going to work, or you've got a bad feeling about it, whatever it might be, you should not go ahead with that idea or project until you know for a fact it's going to work. It's fine if you push out a project that you think is really going to work and it doesn't; that happens, sometimes you can't really help it, and it depends what's going on in the news. But if you go out with a project that you don't think is going to work, it's probably not going to work. So that's where validation comes in. You start asking yourself questions, making sure that what you push out is to the highest standard you possibly can, and that you know journalists are going to be interested in it.”
Okay, so you're just asking yourself, and perhaps your team, questions; you're not necessarily doing a soft launch to a few journalists and getting their feedback?
“Exactly. So, it's more at the ideation stage: you'd ask yourself a list of questions just to make sure the idea passes them. But you can validate in different ways. In the past, we've used Reddit, especially for really big data stories where we've got lots of moving parts, maybe lots of cities or countries involved. Reddit is a really good way of getting that initial feedback. They'll easily pick up on anything they don't think is correct. If you said people in China sit down to pee instead of squat, or whatever, people will be like “no, you're completely wrong”. And then that's another way you can go, oh, actually, there are some key issues here. Have we just mixed up the data in an Excel sheet? Is the data genuinely wrong? Those kinds of things. Reddit is really good because people don't actually give a shit about your project. They do not care whatsoever who you are or what you've done. They're just looking at the data and they'll pick it apart, which is kind of beautiful in a way.”
So, you've shared what SEOs should be doing in 2023. Let's talk about what SEOs shouldn't be doing. What's something that's tempting in terms of time but ultimately counterproductive, something SEOs shouldn't be doing in 2023?
“I think it's looking at campaigns and formats that have worked in the past and assuming they're going to work exactly the same in the future. I think we had a bit of a golden age, probably around the time I started in digital PR. It felt super easy when I started, which is, I guess, kind of a bad thing to say, but the same formats would get you really high results, so people were doing the same things all the time. I noticed that during Coronavirus there was a huge shift in how journalists were dealing with pitches and the kind of content they wanted, because no one knew what readers wanted. It turned out that positive content was going to do better than, say, negative or shock content, and that's kind of how things went. Over the past few years we've still been trying to work out what journalists need and what they're looking for, which has made people struggle. A lot of hero content isn't performing as well as it has done in the past, and I think it's because we're assuming that the same formats, the same things we've been doing, are going to work again in the future. And they do, in a sense. For example, with map campaigns, the format works really well, but that's because it gives you so many outreach opportunities: you can go to lots of different countries and you've usually got multiple angles. But take the experiment where you find out how dirty something is compared to a toilet seat. I've seen a lot of those go out more recently, and one or two have worked really, really well, they've hit the nail on the head, but a lot of them haven't. We just need to be really mindful about what formats and types of campaigns are going to do well. I think that also means we need to open ourselves up to experimentation a lot more, which is hard to say when you work in an agency. You have a client and they've got a minimum number of links that you've promised to deliver, or something like that. It's hard to say we're going to try this new thing and it might completely flop, but trying new things is going to keep you ahead of the industry. Or, failing that, find a format that's been proven in a different industry, whether that's something a data analyst at The New York Times has tried that no one else has yet, or something along those lines, where you can experiment but be comfortable knowing you've got some proof that it will probably work.”
Iona Townsley is a Creative at NeoMam Studios, and you can find her over at neomam.com.