
China's Truman Show Surveillance: 'Sharp Eyes'

As countries worldwide look to loosen restrictions on public life, China looks to strengthen its grip, according to a new five-year government plan.

The state is dangerously turning into a Black Mirror episode that can't be switched off.
Harry Austen via Unsplash

by Harry Austen

Published Mar. 4, 2021 GMT

China is open and booming. After a short lockdown, with a global virus said to have originated within its borders (in Wuhan), Chinese residents are free to travel, work and mix without any restrictions.

While the virus may be one public health issue the nation has resolved, its ramped-up surveillance plans may prove to be an apt competitor.

President Xi Jinping reacts to an adoring crowd at The Great Hall of the People, ahead of the Chinese People's Political Consultative Conference (CPPCC).
Kevin Frayer/Getty Images

Governments track residents within their jurisdiction. Done in doses and regulated internally to avoid breaches of public safety, it keeps people safe and provides urgent information to spot potentially dangerous individuals. Governments deal with bomb and terrorist threats that we, fortunately, rarely hear about – it's one of the responsibilities that make the position so challenging. It's also why the democratic system is essential to keep a government accountable for its actions. And it's why countries without that accountability are crucially dangerous.

China's 2016 five-year plan aimed to have government surveillance covering "100% of the state." They didn't reach that goal, but they came close. The project is named 'Sharp Eyes'.

This AI tracking software, developed by SenseTime Group Ltd. and shown at the Artificial Intelligence Exhibition & Conference in Tokyo in April 2018, is the same technology used in China's mass surveillance systems.
Kiyoshi Ota/Bloomberg

China started its surveillance in 2003, with ambitions to block digital interference from the rest of the world. The initiative, officially entitled 'The Golden Shield Project', is built off the back of hundreds of millions of personal records provided by Chinese citizens. The 'Great Firewall of China', as it was unofficially dubbed, clearly outlines the government's earliest intentions to stop any form of communication against the state. The project was a success.

During a similar timeframe, the state released SafeCities, a project focused more on the logistical side of surveillance. The two projects continued in collaboration when, in 2013, China formed SkyNet. SkyNet targeted urban areas and involved adding facial-recognition hardware to security gate cameras.

According to Dahlia Peterson, spokesperson for Georgetown University’s Center for Security and Emerging Technology, “Chinese state-run media has claimed Skynet can scan the entire Chinese population in one second with 99.8 percent accuracy”. She claims that this does not seem possible as it “ignore[s] glaring technical limitations.”

Peterson explains that the technology is bespoke to each town's needs, and a city can also propose additional measures. Because surveillance is split into regional grids, in a multitude of ways, the government can essentially A/B test one town against the next in respect of security.

China also doesn't seem discouraged from sharing its intentions (at least the outline, not the tech). During HBO's special, 'How China Tracks Everyone' (2019), the lead investigative reporter, Elle Reeve, is introduced to the engineers behind the technology. The special shows everything from the clever use of AI dictatorship, like having your face plastered on a massive screen if you're caught crossing the road illegally, to the more worrying elements of facial recognition being used to pick out any individual in a crowd of thousands.

As displayed above, government officials can identify passers-by via CCTV cameras covering most streets and rural regions.
Gilles Sabrie/The New York Times

During the same special, the VP and spokesperson for Megvii Technology Co., which has received a $1.4B investment (primarily government capital), was quizzed on how he sees the future. Xie Yinan: "Have you seen the film 'Black Mirror'? You stand there and on your face there are points. Every person has their social points I think. Maybe that is the future." He was close.

'Social Credit', a measurement that serves as your digital score, essentially acts as a measure of liability for most of China's 1.4B citizens. It's unclear when it was first ideated, but even before the interview mentioned above, the state had been planning this for some time. Negative behaviour, such as playing loud music or eating on public transport, will be recorded via the immense web of AI at play in almost every region of the country and cause a drop in Social Credit. This level of surveillance requires an awful lot of cameras, and that number is growing by the year.

The highest number of operating CCTV cameras per Country

In Millions (M), the graph shows the number of CCTV cameras per region.
Data via Local Authority/Gov Statistics


While it is important to note that China is, clearly, the world's most populous country, its strides in intrusive technology are uniquely representative of the country's vision.

With plans to increase the number of cameras to 600M within the next year, the goal of 100% surveillance coverage may soon turn into a reality. And while, right now, this shouldn't cause concern for citizens living within the confines of a democratically elected state, China has tried, and will keep trying, to export its top-quality tech to suitable buyers. It's a matter of which country is first to fall.


This article is part of the Morning Coffee collection. This section of the blog features articles to accompany your morning dose of caffeine, quick and easy.

Harry Austen is a Data & Search Analyst. He has worked with the likes of Disney, The Olympics and Zoopla. @austenharry  



The Jeff Bezos Empire

On Feb. 2, 2021, Jeff Bezos announced he would be stepping down as Amazon CEO. For a man known for his intricate planning, the announcement was shockingly out of left field, leaving many to ponder his plans ahead. An excellent place to start, to figure out what's left for Mr Bezos to conquer, is by reviewing what he has already accomplished.

Bezos' departure will give him more to spend on humanitarian pursuits.
Harry Austen via Unsplash

by Harry Austen

Published Feb. 10, 2021 GMT

Updated Mar. 6, 2021 GMT

Jeff Bezos' investments have accumulated over $200B. As such, this article breaks down key projects and initiatives from his investments in companies and subsidiaries, rather than giving a full rundown. While his name will be etched into technological innovation through Amazon, it's worth highlighting the lesser-known schemes that have driven personal and industrial economic growth.

It's worth noting that while this piece will broadly cover some financial nuances relating to stock and share price, the focus is on advancements in tech over capital gains. The interesting details are the processes that have led to such wealth.

Jeff Bezos' investments, 1994–2020

Key initiatives from Amazon, 1994–2020. Whole Foods included as Amazon's biggest investment ($13.7B).

In a sea of innovation, one thing is clear about Bezos: the investment in customer experience (making it easier for consumers to purchase) has been an ethos carried throughout his career. Bezos went from an online book salesman to the greatest product and retail salesman of any generation.

Another point is also clear: the timing of these investments has been inch-perfect. In a similar vein to Apple, the company took hold of the industry by simply addressing customer journey and delivery issues. And the more this happened, the higher the bar became. So much so that users now expect next-day delivery on everything they purchase online. Consumers' mindset has been changed, playing into Amazon's hands with Prime membership and an insanely short purchase funnel, one that feeds back into itself across an array of industries.

It's a tough call, given the sheer number of 'right calls' Bezos has made that paid off, but Prime may be the biggest innovation on the list above (for the moment). Not only was Amazon first to market in the majority of its sectors, but the launch of Prime was also the start of an almost inhuman grip on the market. Doubling eBay, its one major competitor, week by week, this was the point at which investors really started to see a return. With customers hooked on the idea of having a package delivered the next day, which sold and delivered (literally) the same experience you would have at Christmas, the product of brand loyalty was sold perfectly.

With Prime, purchasing online didn't just involve price – which the company, albeit not all the time, does a good job on – but equally the lure of serving you the image of opening a present. This idea quietly stuck with consumers: ordering for the sake of ordering. And in a world that questions most things, having a user base respond in the way it did paid dividends.



The Great Debate: Digital PR vs Technical SEO

With recent comments from SEO's favourite Google Search advocate, John Mueller, the industry has started to debate Digital PR's magnitude: what the term means and its significance versus technical initiatives.

Digital PR is widely misunderstood as just another link-building strategy; Google only cares about links if they correlate to brand.
Harry Austen via Unsplash

by Harry Austen

Published Feb. 4, 2021 GMT

Updated Mar. 6, 2021 GMT

The discussion started when John Mueller replied to Rise At Seven's Creative and Digital Director, Carrie Rose, regarding the growth of PR efforts in the context of Search.

At least to my knowledge, these comments are the first time someone at Google has pointed to Digital PR's legitimacy. We have, of course, heard spurts of significance attributed to link building. But we must not conflate the two. Link building and Digital PR are not the same thing. The former is generally a tactic for driving links (white hat or otherwise, if you're planning to put your or your clients' site at risk), and the latter is how your brand appears (positively or negatively) to Google. Note: links are not an essential metric in the description of Digital PR.

How Google works this out is another discussion, but for this piece it is necessary to note that how other websites talk about your site on the web – the words that surround your brand name when it is mentioned – holds significant value to Google and influences where it positions you on SERPs. The easiest way to imagine this is a scatter graph, with dots indicating positive text about your brand sitting towards the top of the chart and harmful mentions dropping towards its lower end. In the form of thousands of mini-algorithms, this is how Google works out your brand equity.
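To make that scatter-graph analogy concrete, here is a minimal sketch of the idea. It is purely illustrative and in no way a description of Google's actual systems; the word lists, scoring and sample mentions are all hypothetical.

```python
# Illustrative only: a toy version of the "scatter graph" idea above.
# Google's real systems are not public; the word lists and scoring here are hypothetical.

POSITIVE = {"great", "innovative", "trusted", "award-winning", "reliable"}
NEGATIVE = {"scam", "complaint", "lawsuit", "unreliable", "breach"}

def mention_score(text: str) -> int:
    """Score one brand mention: +1 per positive word, -1 per negative word."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

def brand_equity(mentions: list[str]) -> float:
    """Average mention score: each mention is one 'dot' on the scatter graph."""
    scores = [mention_score(m) for m in mentions]
    return sum(scores) / len(scores) if scores else 0.0

if __name__ == "__main__":
    sample = [
        "Acme is a trusted, reliable and award-winning retailer",
        "Another complaint filed against Acme after the data breach",
    ]
    print(brand_equity(sample))  # positive mentions push the score up, negative ones pull it down
```

Every mention becomes one dot; the aggregate is a crude stand-in for the 'brand equity' described above.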

The example above is a simple illustration of how Google's mini-algorithm for this factor might work. I say factor because, ostensibly, that's how Mueller describes it. Even if it isn't a factor, it's a metric Google uses to establish where a domain sits regarding brand position and, almost certainly, standing.

Based on Mueller's comments, it appears to carry an equal or higher level of importance, "in many cases", for webmasters. The debate between the two sides, Technical and PR, has been interesting to watch as a digital showdown. While technical SEOs raise strong arguments noting that essential parts of how a website functions reside under the 'tech' umbrella, many still seem to conflate Digital PR with link building. It may be a case of the term being somewhat of a 'dirty word' for hardcore SEOs. Regardless, it's clear from these comments that Digital PR, whether you like the name or not, is hugely important. Once again, so there is no confusion, Digital PR is not link building – it's how Google views your brand. And under that description, it's crystal clear that the term is here to stay and should be one that SEOs start to take seriously.



Page Experience, one of Google's new ranking factors under the Core Web Vitals umbrella, unveiled with additional time to prepare

Google recently shed details on its new 'Page Experience' metric, delaying the roll-out and marking May as the new deadline for webmasters.

The update is a continuation of the previous warning from Google to webmasters to ensure their site is in order ahead of the deadline.
Harry Austen via Unsplash

by Harry Austen

Published Jan. 19, 2021 GMT

Updated Mar. 6, 2021 GMT

In May 2021, Google will start rolling out Core Web Vitals and taking action against sites that do not deliver a high level of user experience. This marks an additional six months to prepare since Google's first announcement. As with all things Google, the update came with another nugget of improvement for UX on the front end of the search engine itself: with a new visual indicator, Google will signal to users if a site has a poor performance score as of May.

The thinking is that, as a user, you're more likely to click on pages that have the highest score or visual marking. This cements the factor in the same way HTTPS did when it was first released. Google gave all sites time to move to the new protocol, and those that didn't either saw a significant drop in rank and/or were labelled as unsafe for transactions off the bat, before a user even entered. A similar pattern seems to be emerging here.

From what's listed in the release and what Google has done historically, I would expect either nothing or a tiny drop in rankings, at least for the first few weeks while its servers and engineers fix bugs in the system. When you go from theory, plan and test to release – across billions of pages on the web – it takes time to settle. Even with Google's machine learning capabilities, there are likely to be some minor discrepancies here and there, but it's worth noting that this is par for the course. Additionally, while others may suggest not worrying about it, I would most certainly anticipate the onset effects of a potential Google scoring system. In other words, take a stab at what you believe your score would be out of ten; this is a simpler model than whatever is actually in place. With that thought, if a site does not score above 7, you may want to think about ways of improving UX, such as remapping your site taxonomy to spread authority more evenly using wireframes. Google has several great resources if you are just getting started. The linked page provides the basic UX housekeeping (terms, use, Search's role).
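As a back-of-the-envelope illustration of that out-of-ten exercise, the sketch below folds the three Core Web Vitals metrics into one score. The 'good' thresholds (LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1) are Google's published guidance; collapsing them into a 0–10 score in this way is an invented example, not Google's actual formula.

```python
# A hypothetical out-of-ten score for a page's Core Web Vitals.
# The "good" thresholds (LCP <= 2.5s, FID <= 100ms, CLS <= 0.1) are Google's published
# guidance; the weighting into a 0-10 score is an invented illustration, not Google's formula.

def cwv_score(lcp_s: float, fid_ms: float, cls: float) -> float:
    """Return a rough 0-10 score: each metric contributes up to a third of the total."""
    def ratio(value: float, good: float) -> float:
        # 1.0 when at or under the "good" threshold, decaying towards 0 as it worsens.
        return min(1.0, good / value) if value > 0 else 1.0

    parts = [ratio(lcp_s, 2.5), ratio(fid_ms, 100.0), ratio(cls, 0.1)]
    return round(10 * sum(parts) / len(parts), 1)

if __name__ == "__main__":
    print(cwv_score(lcp_s=2.1, fid_ms=80, cls=0.05))   # comfortably "good" -> near 10
    print(cwv_score(lcp_s=5.0, fid_ms=300, cls=0.4))   # poor on all three -> well below 7
```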

With unprecedented roll-out announcements, the search engine is attempting to give webmasters time to make the relevant changes. Now would be a great time to work on a new web template. Page Experience, alongside general user experience, will help inform Google's overall opinion of your site, which ultimately determines where the company positions you on the results page.



Technocratic Anarchy: Big Tech & Disillusioned Voters

In light of the notorious Capitol takeover, this piece takes a stab at big tech’s positioning in the political landscape to see how Twitter, Facebook and Co. gain stronger control over democratic policy.

For once, Trump is silent, leaving the world to debate both his actions and Big Tech's decision to permanently remove him from their platforms.
Harry Austen via Unsplash

by Harry Austen

Published Jan. 10, 2021 GMT

Updated Mar. 6, 2021 GMT

Since writing, more organisations have sided with Twitter and followed their lead, cutting more ties with Trump. Those are below.

  • Jan 8, 2021 – Twitter has confirmed the removal of the account indefinitely.
  • Jan 9, 2021 – Google has suspended the ostensibly right-wing social media app, Parler, for inciting acts of violence.
  • Jan 10, 2021 – According to Shopify, the company started removing stores selling Trump-related merchandise two days before Twitter's announcement.

On January 6, 2021, the soon-to-be-ousted President Trump hosted a rally in a last-ditch effort to overturn state election results agreed by most state officials, both Republican and Democrat. This was at noon. By 2:00 PM, chaos had ensued. Not only had rioters made their way into unrestricted parts of the Capitol, but a number of them pushed past the outnumbered security and Capitol Police into the Senate and other private areas of the building. Once the building was evacuated, the damage could clearly be seen in the form of smashed windows, stolen and vandalised office and desk equipment, as well as more brutal damage to the physical confines of the building and its chambers. In a nightmarish metaphor, the architecture of the Capitol was literally on shaky ground. Unfortunately for the one-term President, the shaky ground was not in the form of state corruption, as he argued, but from his loyal fan base following, in the broadest sense, orders.

Scenes around early afternoon: protesters, loyalists, superfans and onlookers gather in huge numbers to unnerve those inside the Capitol.
Harry Austen via Doug Mills/The New York Times

It would be fair to say Twitter and Trump haven't had the steadiest of relationships. The Floridian has frequently taken jabs at its chief executive, Jack Dorsey, and the President holds the perception that the social media giant controls communication between himself and the American people. This is all being said, mind you, while he sends said tweets on the very platform in question, broadcasting to his tens of millions of followers.

All this while he fails (or refuses) to see and admit the power he holds at the push of a button. As displayed on Jan 6, the power this man has, a man who seems to enjoy the smell of chaos around the Capitol corridors, is a danger. If that display of terror was not enough to convince Trump, who in similar instances has played coy for as long as he could before deciding it was time to pick up his action figures, then surely nothing will. At this point, the war you would need to wage to convince him of his own power would not be worth fighting.

The President has run roughshod over the constitution, attempted to overturn laws that have stood, and will stand, the test of time, and even turned on one of his own, Mike Pence. His running mate's exile was done at the flick of a switch and should give us an insight into how his administration may have been run throughout his tenure. Now, in what has to be a surprise to Pence himself, he is being asked to force Trump out. Notwithstanding the chaos, the most interesting point remains Trump's run-ins with social media organisations and Big Tech.

Trump isn't just the President. Or at least, he wouldn't want to be considered just that – he's a celebrity. And when you are a celebrity, you are afforded several additional privileges. Clearly, the greater the wealth and fame, the more of these present and manifest themselves. In Trump's case, his closest touchpoint for basking in that fame is sending out a tweet. And while every president who has served in parallel with Twitter's existence has used the designated Presidential Twitter account (@POTUS), Trump refused. If anything, he understands the power of social media and, politics aside, the man has a knack for media and marketing.

However, on January 8, 2021, following the heinous acts detailed above, it did not matter. His self-promoting loudspeaker, broadcasting for hundreds of thousands of miles, was switched off. Jack Dorsey and the other Twitter executives ultimately found that the violence at the Capitol was reaching boiling point. Trump's silence on the matter, I'm sure, only egged on the protesters to continue, assuming all would go to plan in Camp Trump.

Once Twitter fired the first shot, others such as Facebook fell in line. Facebook has concluded that he will no longer be allowed to use Facebook in any form until the end of his Presidential reign. According to Zuckerberg, he “believe[s] the risks of allowing the President to continue to use our service during this period are simply too great. Therefore, we are extending the block we have placed on his Facebook and Instagram accounts indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”

One of the biggest complaints about suspending Trump, which is hard to disagree with, is: "who put these social media sites in charge of deciding how the highest-ranking official in the West can act or notify the public?" At a point where it was quite literally damage limitation, the correct decision was made. But again, notwithstanding the previous comments, should Mark Zuckerberg or any social media CEO have the ability to turn your lights off? It's a debate that will only be settled with a solution in the form of legislation.

Mark Zuckerberg appears before Congress as Big Tech's influence comes under growing scrutiny.
Harry Austen via Bloomberg

Conclusion: Big Tech's Growing Influence

It may not sound very optimistic, but these companies will ultimately take whatever steps they need to profit. In fact, with the number of shareholders involved, put in the same position, I'm sure we would all want our company involved in the conversation around future profits and potential revenue. The seat at the table, as it were, will without exception include, or be lobbied in favour of, Big Tech. As for a few companies collectively being able to silence your voice online (which Twitter did under the correct terms), the issue at hand is not the fault of the company but of the legislation surrounding the basis of that decision.

In other words, Twitter and any other platform have the legal right to remove any account under their terms. However, because there is no online law under which to dispute social media companies' decisions, Jack Dorsey and Mark Zuckerberg effectively play god in their hard-coded heaven. Is that fair? Well, yes. A service agreement that is too long and never read, which we accept when signing up, gives them that right. The question should also be: is it right? In respect of Trump, most probably. With that being said, the latter question still needs more work. In the next few years (and beyond), I'm sure that tech legislation will be one of the most significant points of contention.



Quality Assurance Testing for Analytics

This is a guide to quality assurance testing for analytics: making sure the data you see in Google or Adobe Analytics reflects what you expect to see. With a logical framework and a simple test plan, you can explain the testing process you take to both clients and stakeholders with more clarity.

If you live and die by the results you see show up in your analytics, they had best be right.
Harry Austen via Unsplash

by Harry Austen

Published Nov. 22, 2020 GMT

Updated Mar. 6, 2021 GMT

One of the fundamentals is often missed when it comes to quality assurance: the reason for completing the task in the first place, which is ensuring that your data house is in order. In other words, what you see in Google Analytics should reflect what you expect to see. This is, of course, relative to the size and popularity of your website, but all in all you should have a strong idea of how many visits, sessions and other metrics of that nature you expect to see show up in your account. It's always best to ensure that tracking and implementation are set up correctly; otherwise, the accuracy of the data being pulled into Adobe or Google Analytics could come into question. This is where quality assurance testing comes in. It is a simple test to check that your analytics implementations have gone to plan. Generally speaking, it should not entail a lot of technical know-how, as long as the plan (which I will lay out shortly) is precisely explained.

Before we jump into the test case, it is worth explaining the difference between two terms that may overlap during this article: quality assurance and quality control. Ostensibly, these may appear to sound and act the same way, and while that is not entirely untrue, they serve as different terms in this context. Quality assurance, in its simplest terms, is the test we will be undertaking to review the output of our data. Quality control involves a more thorough look under the hood, reviewing the engine in more detail. Generally, a quality control review would occur after identifying any issues during the QA (quality assurance) test. It's worth noting that this isn't always the course of action, but it's the route I take when testing and modifying implementations.

Below is a simple framework to keep in mind when testing data quality, to make things easier when following along. This framework will go a long way when reviewing the effectiveness of your implementations and tests. The framework is the lifecycle (steps) to ensuring data quality.

1. Define the business process, data rules, data goal and stakeholders involved.
2. Measure existing data in line with the above data rules and subsequent goals.
3. Analyse existing data against the data quality goal. (Generally, this takes the form of gap analysis.)
4. Based on the Analyse (3) findings, improve, with a new initiative to solve the issue(s) raised.
5. Implement the business and technical solutions devised at the Improve (4) stage.
6. Control (and measure) the data to assess unbiased implementation, in line with the data rules and goals.

At the final stage (6) the data should then be in a good place to monitor. In other words, this should be included in any business reporting or dashboard to review daily or weekly for updates. This will form the basis for future data initiatives, so tracking is essential to record progress and mitigate errors.
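As a minimal sketch of what that final monitoring stage might look like in practice (the metric names, expected figures and 10% tolerance are hypothetical, and the analytics pull is stubbed out rather than tied to any specific API):

```python
# A hypothetical monitoring check for stage 6: compare what analytics reports
# against what you expect to see, and flag anything outside a tolerance.
# The expected figures and 10% tolerance are illustrative, not a standard.

EXPECTED = {"sessions": 12000, "users": 9500, "transactions": 340}  # weekly expectations
TOLERANCE = 0.10  # flag anything more than 10% away from expectation

def qa_check(actual: dict[str, float]) -> dict[str, str]:
    """Return a pass/fail verdict per metric."""
    results = {}
    for metric, expected in EXPECTED.items():
        value = actual.get(metric, 0)
        drift = abs(value - expected) / expected
        results[metric] = "OK" if drift <= TOLERANCE else f"CHECK ({drift:.0%} drift)"
    return results

if __name__ == "__main__":
    # In practice `actual` would come from your Adobe or Google Analytics export.
    actual = {"sessions": 11800, "users": 7100, "transactions": 355}
    for metric, verdict in qa_check(actual).items():
        print(f"{metric}: {verdict}")
```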

A test plan is usually undertaken right before an element, page or template goes live on a website. I wanted to include it here as it's an entirely relevant sub-step to quality assurance. Consisting of two components, a test plan isn't quite as taxing as a full QA plan. With that being said, it should not replace any QA work you are doing. The two have different purposes but ultimately both help ensure accurate web and data functionality.

The linked scorecard case study has more detail on the terms above if you need more clarity on the definitions and use cases. A test plan should only concern one section, page or template. This isn't a site-wide test; it is, in other words, more akin to an A/B test of sorts. The first step is to list a series of interactions a user can make on a given page or element. Then, for stage two, list the data you expect to see from said interactions. This could be metrics as basic as hits (sessions, clicks, etc.), listed individually against the aforementioned actions a user takes. A thought to keep in mind is that this process should instil trust in the data you are working with.

If you wanted to add more detail to the test plan, you could also add in stakeholders responsible, method of delivery and tooling. However, for this example, this is where it ends. And while that last step isn’t being done here, it does not make it unimportant. With that being said, from the examples of test cases and research here, you should have an idea of how far you would like to take the evaluation.
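To make the two components concrete, here is a minimal sketch of a test plan for a single page; the page, interactions, expected events and optional extras are all hypothetical.

```python
# A hypothetical two-component test plan for a single page or template:
# (1) the interactions a user can make, and (2) the data each one should produce.
# Page name, interactions and expected metrics are illustrative only.

test_plan = {
    "page": "/checkout",
    "cases": [
        {"interaction": "click 'Add to basket'",   "expected_event": "add_to_cart", "expected_hits": ">= 1"},
        {"interaction": "submit payment form",      "expected_event": "purchase",    "expected_hits": "== 1"},
        {"interaction": "open delivery FAQ toggle", "expected_event": "faq_open",    "expected_hits": ">= 1"},
    ],
    # Optional extras mentioned above: owners, delivery method and tooling.
    "owner": "Analytics team",
    "tooling": "Tag manager preview + analytics debugger",
}

for case in test_plan["cases"]:
    print(f"{case['interaction']:30} -> expect '{case['expected_event']}' ({case['expected_hits']})")
```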



Website Wireframe: The Case for SEO

With ‘Page Experience’ now an official ranking factor, it is time to start working on our understanding of SEO wireframing. The once UX specific term has crossed over to the world of SEO, and the name means a lot more than just a good physical design. This article explains what UX means in SEO terms and how to apply wireframes with real-world examples.

As wireframing and other aspects of UX creep into Search, amid algorithm updates favouring the topic, it's key to get a solid understanding of its effect.
Harry Austen via Unsplash

by Harry Austen

Published Dec. 16, 2020 GMT

Updated Mar. 6, 2021 GMT

The term 'wireframe' generally refers to the UX work done ahead of building a new product, service or website. It is the process of building, essentially, a blueprint of what the finished article will look like (literally, in this circumstance). Several excellent articles already cover the topic of website wireframes, but here is one that inspired this piece, via the fantastic Matt Wright.

As announced in a recent Google update, 'Page Experience' is now an official ranking factor. I do not think this comes as the biggest surprise to folks in the industry. We've known, or rather assimilated and assumed, that users' behaviour and interaction play a significant role in Search. Those assumptions are correct. What it means in a broader context is that people are starting to view site design differently, particularly as it pertains to SEO, which is a fascinating place to be. More importantly, users and user-agents (Google bots and the like) are no longer willing to accept low interaction and poor navigation. Gone are the days of sidebar overloads and pixelated, tough-to-read text and imagery. Webmasters need to be more strategic in their approach, looking not just at the way the site looks but also at how it runs.

Website Wireframe Case Studies

SEO Wireframe UX Template
General SEO Wireframe layout.

1. Menu/Navigation – The navigation bar is simple, with critical pages you are looking to target.
2. Multimedia – Video or imagery showing off your business or service, with alt-tags to help indicate to Google what they mean.
3. Trust signals – In the form of social links to your business, these links provide evidence of your online presence as a brand.
4. Content block – A snippet of content or a feature from your product page or blog to add context to the page and further inform Google.
5. FAQs – Frequently asked questions from customers, answered on first viewing of the site.
6. Geo – Basic location information, ‘where to find us’ page, to help customers find your business or at least find out more.

The above is a concrete layout of the type of page template that works for SEO. The elements look relatively simple (because they are). The inner designer in all of us tends to overcomplicate layouts. The basic grid layout is a classic UX-focused approach that helps lock in elements guaranteed to improve a website's experience for a user.

Product SEO Wireframe UX Template
Product SEO Wireframe layout.

1. Breadcrumbs – Adds hierarchy of navigation, what path the user has taken to get to this page; helping both physical users and bots better move around and understand your site.
2. Featured product image (with alt-tags) – Add pictures to show off your product, preferably with alt-tags to help bots understand the image's meaning.
3. Reviews – In a similar way to social links, this section acts as a trust signal, proving that users have previously purchased items from the site and, hopefully, enjoyed them.
4. Media – Add additional images to show off your product. This should include as many photos as possible.
5. Videos – These also work tremendously well if the product is better explained via a video walk-through.
6. Product Description – This gives further context to the imagery and will again help provide more information to user agents (bots) crawling the page.
7. Alternative products – To ensure you are enhancing the user experience, add more products (that are similar to the ones you are currently displaying) to make it easy for a user to jump between products.

The above is a strong layout of the type of page template that works for a product page. Although I have included fewer elements, this template allows you to take advantage of schema markup. There are clearly some other personal additions to make, but you can see how easy it is to make a page much more accessible for a crawler with simple navigation changes. Ultimately, the experience a user goes through will be the defining factor in years to come.
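For instance, the product elements above (featured image, reviews, product description) map naturally onto schema.org's Product markup. Below is a minimal sketch, built as a Python dict and serialised to JSON-LD; the product details are invented for illustration.

```python
# A hypothetical JSON-LD Product snippet built from the wireframe elements above
# (featured image, reviews, product description). Values are invented for illustration.
import json

product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "image": ["https://www.example.com/images/trail-shoe.jpg"],  # featured image (2)
    "description": "Lightweight trail running shoe with reinforced grip.",  # description (6)
    "aggregateRating": {  # reviews (3) double as trust signals
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

# Embed the output inside a <script type="application/ld+json"> tag on the product page.
print(json.dumps(product_schema, indent=2))
```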



Outbound Links: An Attribution Model

This piece reflects on a project I worked on that involved building a model that reviewed correlations between our outbound link profile and weighted metrics known to correlate with better search results. The link attribution was later applied to forecast several campaigns and initiatives.

Ahrefs data isn't the most reliable. While the acting search agent crawls hundreds of millions of pages, they may not always be the ones you're after.
Harry Austen via Unsplash

by Harry Austen

Published Dec. 16, 2020 GMT

Updated Mar. 6, 2021 GMT

The reason for building an intricate model, theoretically attempting to figure out how Google ranks based on links, starts with Ahrefs. And those damn links. When you are working for a small site with a mid-level of engagement in terms of links, Ahrefs, SEMrush and other web scrapers are incredibly valuable as a measurement of current backlink performance. When you work for a client with over 20M links, it's a different story.

The reality of Ahrefs' crawl quality and coverage, or lack thereof, meant constant discrepancies in link data. It's worth noting that this isn't entirely Ahrefs' fault. I couldn't imagine the amount of planning that goes into a daily crawl of links on their end, and for this client in particular, links are a bit of an anomaly to start with. With the metric underpinning a reasonably large chunk of importance, accurate reporting is critical to understanding and forecasting SEO long-term. There have also been reports of Ahrefs fudging the numbers for analytics. Again, this is not a true reflection of their output, as they offer very good SEO services. But it's always an idea to take their numbers, and those of other services for that matter, with a grain of salt.

Ahrefs Link discrepancy
This graph shows the error level of links requested via the Ahrefs API.

In a task that I wouldn't recommend to anyone, I went through our entire backlog of LRDs (our outbound link profile) to manually spot issues ahead of running the model. Going through just over 4k LRDs one by one was a delicate experience. The silver lining of completing such a mammoth task was that a) I got a first-hand look at how egregious the links Ahrefs collected were, and b) I completed about four years' worth of disavowing in one week. Equally, the web and IP disavows were then fitted as our exclusions. Due to the size, and the fact that we wanted to target both web addresses and IP addresses, we were forced to apply these exclusions in three waves; in other words, three points at which we would have a chance to add additional exclusions. These would be in R itself, in BigQuery and (eventually) in Data Studio when we visualised the results. We wanted to be very cautious about our pull limits (the amount of data we wanted to request) – 20M ain't cheap – and this was a good way to limit the damage. No-follow links were also excluded from our list, seeing as they didn't fit our SEO-focused criterion.
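As a rough sketch of what one of those exclusion passes might look like in code (the column names, disavow lists and no-follow flag below are hypothetical, not the client's actual schema):

```python
# A hypothetical exclusion pass over an exported link table.
# Column names, disavow lists and the no-follow flag are illustrative only.
import pandas as pd

links = pd.DataFrame({
    "source_domain": ["good-news.com", "spammy.biz", "blog.example.org"],
    "source_ip": ["93.184.216.34", "203.0.113.9", "198.51.100.7"],
    "rel_nofollow": [False, False, True],
})

disavowed_domains = {"spammy.biz"}
disavowed_ips = {"198.51.100.7"}

clean = links[
    ~links["source_domain"].isin(disavowed_domains)
    & ~links["source_ip"].isin(disavowed_ips)
    & ~links["rel_nofollow"]          # no-follow links excluded, as above
]
print(clean)
```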

Once we had a cleaned-up set of links, it was time to run them through the model to test our complex hypothesis. Included in the list of theories to test were both SEO myths and truths: the likes of Domain Authority, Page Authority, LTD, the number of total external links, language, and everything under the SEO sun. After consideration, we decided the simplest way to express a link's 'true value' was a score out of ten.
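A heavily simplified sketch of that scoring idea is shown below; the metric names, weights and damping values are hypothetical placeholders rather than anything the model actually settled on.

```python
# A hypothetical weighted link score out of ten.
# Metric names, weights and damping factors are placeholders, not the real model.

WEIGHTS = {"domain_rating": 0.5, "page_authority": 0.3, "external_links": 0.2}

def link_score(metrics: dict[str, float], nofollow: bool, age_years: float) -> float:
    """Combine normalised (0-1) metrics into a 0-10 score, then apply damping factors."""
    base = sum(WEIGHTS[name] * metrics.get(name, 0.0) for name in WEIGHTS)  # 0-1 range
    damping = 1.0
    if nofollow:
        damping *= 0.1                               # heavily damp no-follow links
    damping *= max(0.5, 1.0 - 0.05 * age_years)      # older links decay in freshness
    return round(10 * base * damping, 2)

print(link_score({"domain_rating": 0.8, "page_authority": 0.6, "external_links": 0.4},
                 nofollow=False, age_years=2))
```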

As well as weighting metrics that are known to correlate with better search results, we also added damping factors such as the aforementioned no-follows and freshness (date published). These damping factors are known as coefficients. In technical terms, for the statisticians reading, the model drew on both the 'Coefficient of Variation' and the 'Relative Standard Deviation'. There are several excellent guides, but I've linked the ones (above) that I found useful when explaining the concept.
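For reference, the two statistics are closely related: the coefficient of variation is the standard deviation relative to the mean, and the relative standard deviation is its absolute value expressed as a percentage. With σ the standard deviation and μ the mean of the link scores:

```latex
CV = \frac{\sigma}{\mu},
\qquad
RSD = |CV| \times 100\% = \frac{\sigma}{|\mu|} \times 100\%
```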

Once we ran the links through the model (which I can't share, for reasons you already know), we started to see results in the form of signals. For example, we initially hypothesised that Domain Rating was the largest factor and weighted it accordingly. However, the results didn't bear that out – and the same false prediction applied to several other metrics. So we adjusted the weighting, again and again, until we reached a clear conclusion and, ultimately, according to our data, a clear idea of which factors carry the most authority when it comes to links. Once standardised, this was then fitted into our weekly reporting moving forward.
