Deciding what to do to give your news article the best chance of performing in Google can be daunting. Here I look at the seven most valuable optimisation elements.
I agree with almost everything here as a starting point! I would definitely add "subheads within articles where appropriate", because those are great for helping both users and Google orient themselves.
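To make "subheads" concrete: in HTML they are typically `h2`/`h3` elements inside the article body. Here's a minimal sketch of how you might audit a page for them; the function name, URL, and the choice of tags are my own illustration, not anything Google prescribes.

```python
# Minimal sketch: check an article's HTML for subheads (h2/h3).
# Assumes the requests and beautifulsoup4 packages; the URL is
# a placeholder.
import requests
from bs4 import BeautifulSoup

def count_subheads(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Subheads usually live inside the <article> element when one exists.
    article = soup.find("article") or soup
    return len(article.find_all(["h2", "h3"]))

if __name__ == "__main__":
    n = count_subheads("https://example.com/some-news-story")
    print(f"Found {n} subheads")
```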
I was under the impression that vector embeddings/Word2Vec were no longer in use since transformers were incorporated into the ML models, though that doesn't really change the fundamental instructions for journalists.
I don’t believe transformers tech has been implemented in search yet, outside of AI Overviews. It’s too expensive to do that for every query (an AIO takes 6x as much energy to generate as a regular result), and the nature of LLMs doesn’t lend itself well to the flexibility that ranking factors require.
Thanks! I was thinking Word2Vec was replaced by BERT, but I'm not sure where that fits in for news SEO.
As far as I understand it, BERT is used to understand queries better (especially ones Google hasn't seen before, and so has no click data to guide its rankings). It's not used in building the index, as transformer-based tech is probably not suitable for language analysis at that scale.
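For readers wondering what the distinction in this exchange means in practice: Word2Vec assigns each word a single static vector, while transformer models like BERT encode whole passages with context taken into account. A minimal sketch using the gensim and sentence-transformers libraries, illustrative of the general technique only and not of Google's internal systems:

```python
# Illustrative only: static word vectors (Word2Vec) vs transformer-based
# embeddings (BERT-style). This is not Google's implementation.
from gensim.models import Word2Vec
from sentence_transformers import SentenceTransformer

corpus = [
    ["the", "bank", "approved", "the", "loan"],
    ["we", "sat", "on", "the", "river", "bank"],
]

# Word2Vec: one fixed vector per word, regardless of context --
# "bank" gets the same vector in both sentences above.
w2v = Word2Vec(corpus, vector_size=50, min_count=1, seed=1)
print(w2v.wv["bank"][:5])

# BERT-style: the whole sentence is encoded, so surrounding context
# changes the resulting vector.
bert = SentenceTransformer("all-MiniLM-L6-v2")
vecs = bert.encode(["the bank approved the loan",
                    "we sat on the river bank"])
print(vecs.shape)  # (2, 384)
```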
If you were to update an old article for Google, how would you go about it?
That really depends on what you want to achieve. Evergreen articles should stay on the same URL with updated content, timestamps, and even images. News articles have an entirely different purpose, and updating them tends to happen in the course of a story's news cycle.
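On the evergreen side, "updated timestamps" usually means both the visible dateline and the page's structured data. A minimal sketch of the schema.org Article JSON-LD involved; all field values are placeholders:

```python
# Minimal sketch: refreshing dateModified in an article's JSON-LD while
# keeping the URL and datePublished stable. Values are placeholders.
import json
from datetime import datetime, timezone

article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to brew better coffee at home",
    "datePublished": "2021-03-04T09:00:00+00:00",
    # Bump dateModified whenever the content is substantively refreshed;
    # the URL itself stays the same.
    "dateModified": datetime.now(timezone.utc).isoformat(timespec="seconds"),
}

print(json.dumps(article_jsonld, indent=2))
```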
Strongly agree! BTW, I think the summary or intro sentence is worth applying to every informative piece of content, including blog posts.
Side note: you can be creative and experiment with the concept to the point where it turns into whatever best fits your page template, even if it's no longer a sentence.
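To illustrate that last point, here's a hedged sketch using jinja2: the same summary data can render as an intro sentence or as a key-points box, depending on the template. The markup, class names, and field names are all hypothetical, just to show the idea.

```python
# Sketch: one summary field, two presentations. Template markup and
# field names are hypothetical.
from jinja2 import Template

summary = {
    "intro": "The council voted 7-2 to approve the new cycle lanes.",
    "key_points": [
        "Vote passed 7-2 on Tuesday",
        "Construction starts in March",
        "Funded by the existing transport budget",
    ],
}

# Option A: a single intro sentence at the top of the article.
intro_tpl = Template('<p class="standfirst">{{ intro }}</p>')

# Option B: the same information as a key-points box.
points_tpl = Template(
    '<ul class="key-points">{% for p in key_points %}'
    '<li>{{ p }}</li>{% endfor %}</ul>'
)

print(intro_tpl.render(**summary))
print(points_tpl.render(**summary))
```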