Debunking Myths: Can AI Content Actually Rank on Google?

Google's stance on AI-generated content is pivotal, especially as search engines evolve to prioritise quality over quantity. Understanding the intricacies of Google's expectations—particularly the concepts of E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness)—is essential and empowering for anyone looking to harness AI effectively in their content strategy. These guiding principles affect SEO rankings and determine how well an audience receives the material.

This article seeks to debunk prevailing myths surrounding AI-generated content, explore its advantages and challenges, and provide practical guidance on ensuring that AI text can achieve strong SEO results. By examining key factors influencing rankings and best practices for content creation, we aim to equip you with the knowledge to navigate the intersection of AI and digital marketing successfully.

Google's Stance on AI-generated Content

Google's stance on AI-generated content is pragmatic and transparent. As long as AI-generated articles adhere to the platform's guidelines for helpful content, ranking in search results is not restricted. This is reassuring: Google's primary focus remains on the quality of content, its relevance, and user intent rather than on the creation method, opening up new avenues for digital marketing strategies.

Here are key factors that influence search engine rankings:

- Quality and Relevance: Content must be high-quality, relevant to user queries, and demonstrate expertise, authoritativeness, and trustworthiness, regardless of whether AI or human writers generate it.

- Value to Users: Content must provide genuine value to users. Google emphasises that valuable content is essential to achieve high rankings. This 'value' is often determined by the user's intent, which refers to the specific information or solution they seek when they search.

- Policy Compliance: AI content must adhere to Google's policies. Content that doesn't, such as low-quality or misleading content that is overly promotional or contains deceptive information, may be penalised or removed. This underscores the importance of maintaining the quality and relevance of AI-generated content.

Google evaluates content based on its substance rather than its origin. This means that AI-generated and human-generated content alike can rank successfully if they maintain the high standards the search engine giant expects. Whatever the creation method, the signal is clear: quality is key.

Importance of Quality in Content

In the vast content industry, content quality reigns supreme. Google's search algorithms prioritise high-quality, relevant content over material produced merely to please search engines, which means success in rankings depends on the quality of the content rather than on how it was created. That places the responsibility squarely on you to create valuable content.

High-quality content must meet user expectations by being both informative and engaging. It is not about stuffing keywords to game the system but about crafting an article that speaks directly to user intent. Originality is another cornerstone of valuable content. Plagiarised or copied material fails to stand out and risks penalties from Google, which vigilantly champions unique and original insights.

Establishing authority elevates the content's rank significantly. You can achieve this by leveraging credible sources and incorporating thorough research. Moreover, engagement techniques, such as images, lists, and short paragraphs, are essential elements of a high-quality article to maintain reader interest. These techniques enhance user experience and signal to Google that the content is worth elevating.

Understanding E-E-A-T

E-E-A-T, which stands for Experience, Expertise, Authoritativeness, and Trustworthiness, is a fundamental framework Google uses to assess content quality. This framework ensures that users receive reliable and credible information, elevating the standards for the content creation process. 

- Experience: Demonstrating practical knowledge through direct observation assures readers of the authenticity of the content.

- Expertise: The proficiency gained from training and experience allows content creators to provide deep insights, enhancing the content's helpfulness.

- Authoritativeness: As a key source of accurate information, authoritative content garners user trust, boosting confidence in the material shared.

- Trustworthiness: Ethical information sourcing and transparent communication are imperative for building user trust and Google's confidence in the content.

By adhering to these principles, content generators—AI or human writers alike—can craft material that aligns with E-E-A-T and resonates with users and search engines.

Experience and Expertise

Experience in content generation means showcasing practical knowledge or direct observation within a specific field. This assures users that they are receiving reliable and authentic content, further reinforcing their trust in the material provided.

Expertise, on the other hand, embodies the skills and proficiency honed through training and experience. This depth enables content creators to share profound insights that make the content more relevant and useful for the audience.

Embracing Google's E-E-A-T principles is crucial for content that aligns with user satisfaction and significantly boosts search engine rankings. Whether AI-generated or crafted by human minds, content must provide value and assistance. Therefore, emphasising originality, quality, and relevance is non-negotiable for content that wants to make its mark in search results.

Authoritativeness and Trustworthiness

Authoritativeness in content creation is not merely a preference but crucial for achieving higher search rankings. It is established through reliable data, expert opinions, and robust statistics, which collectively enhance the credibility of the content. Responsible content generation meets Google's standards and resonates more effectively with users.

Citing sources correctly plays a pivotal role in enhancing an article's trustworthiness. It aids in establishing a solid foundation of authority and reliability, which is vital for SEO success. High-quality content bolstered by authoritative references signals to Google that it deserves higher recognition in search results.

The E-E-A-T principles emphasise the importance of Experience, Expertise, Authoritativeness, and Trustworthiness in content that serves user needs and excels in search rankings. Clear communication and ethical information gathering are of utmost importance for building user trust, which in turn shapes Google's confidence in both the content and its creator. Following these guidelines ensures content that ranks well and truly fulfils its purpose of informing and engaging.

Key Factors Influencing SEO Rankings

In the ever-evolving world of SEO, understanding what underpins the rankings on Google is crucial for content creators aiming to enhance their visibility. It's not just about creating great content but ensuring that it aligns with the ranking factors that Google prioritises. Here's a breakdown of some key factors influencing SEO rankings:

  1. Quality of Content: The quality and relevance of your content are paramount. High-quality, informative, engaging content is more likely to achieve higher rankings on Google's search engine results pages (SERPs).
  2. Backlinks: Acquiring high-quality backlinks from relevant websites can significantly boost your visibility. The more authoritative the sources linking to your content, the better your chances of climbing the SERP ladder.
  3. Keyword Optimisation: Effective keyword placement in title tags, meta tags, and the first 100 words of your content remains vital. This helps grab user attention and tell search engines what your page is about (a simple check is sketched after this list).
  4. User Engagement Metrics: Metrics like time-on-site and bounce rates provide Google with data that can influence rankings. Content that keeps users engaged is more likely to rank highly.
  5. Content Strategy Optimisation: Because ranking factors and algorithms continue to evolve with advances in artificial intelligence, content creators must consistently optimise and refine their strategies to stay competitive.
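As a rough illustration of point 3, here is a minimal Python sketch, assuming the BeautifulSoup library (beautifulsoup4) and a stand-in HTML snippet, that checks whether a target keyword appears in the title tag, the meta description, and the first 100 words of a page:

```python
# Minimal sketch: check whether a keyword appears in the title tag,
# meta description, and the first 100 words of a page's visible text.
# Assumes beautifulsoup4 is installed; the HTML below is a stand-in.
from bs4 import BeautifulSoup

def keyword_placement_report(html: str, keyword: str) -> dict:
    soup = BeautifulSoup(html, "html.parser")
    kw = keyword.lower()

    title = (soup.title.string or "") if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "") if meta else ""
    first_100_words = " ".join(soup.get_text(" ").split()[:100])

    return {
        "in_title": kw in title.lower(),
        "in_meta_description": kw in description.lower(),
        "in_first_100_words": kw in first_100_words.lower(),
    }

sample_html = (
    "<html><head><title>AI Content and SEO</title>"
    "<meta name='description' content='Can AI content rank on Google?'>"
    "</head><body><p>AI content can rank when it is helpful and original.</p></body></html>"
)
print(keyword_placement_report(sample_html, "AI content"))
```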

Relevance of Content

Content relevance is a linchpin in achieving positive search engine rankings. Here's why:

- High-Quality Output: Google's guidelines stress the importance of generating quality content regardless of production method. Whether generated by humans or AI, the outcome must be top-notch.

- AI Content: Feeding AI tools with comprehensive and accurate data sets can boost the relevance and credibility of AI-generated content, helping it meet Google's standards.

- User Intent: Content must be designed to meet user intent. Addressing what users seek with particular keywords makes your content more likely to resonate with audiences.

- Informative and Engaging: Besides being SEO-friendly, content must be people-friendly. This means being clear, concise, and ultimately valuable to ensure engagement and satisfaction.

User Engagement Metrics

User engagement metrics are critical for determining the success of your content in Google's eyes. Here's how they work:

- Time on Site: When users spend more time on a page, it signals to Google that the content is valuable, which can result in higher search rankings.

- Social Sharing: Engaged users tend to share content on social media, boosting its visibility and credibility.

- Low Bounce Rates: Content that encourages interaction, such as through comments or shares, tends to have lower bounce rates. Google interprets this as a sign of quality.

- Overall Effectiveness: Google evaluates a page's effectiveness based on content quality and user engagement, underscoring the need for content that captivates and retains audience interest.

The intersection of quality and engagement metrics forms a strong foundation for ranking well on Google. Focusing on these factors can propel your content to new heights in search engine rankings, whether through the creative human touch or the precision of AI-generated content.

Crafting High-ranking AI-generated Texts

In the evolving digital landscape, the role of AI in content creation has become increasingly prominent. Despite initial scepticism, it's clear that AI-generated content can indeed rank on Google, provided it meets high quality and relevance standards. For search engines like Google, the primary focus remains the quality of content rather than its origin. Therefore, content creators and generators must prioritise producing valuable content that effectively addresses user intent.

A strategic approach combining AI efficiency with human creativity is essential to ensuring that AI-written articles rank well; this blend can also improve conversion rates. Content creation should incorporate rigorous SEO techniques, leveraging SEO tools for keyword research to identify and seamlessly integrate relevant terms throughout the text.

Additionally, periodic reviews and updates of AI-generated content are crucial. Human oversight enhances content quality and aligns it with brand messaging, thus reinforcing its potential to secure better search engine rankings. Collaboration between human editors and AI tools brings technology and human insight together, ensuring that the content industry produces engaging and informative articles.

Importance of Human Editorial Oversight

Human involvement is indispensable in the creation of AI content. Human writers and editors ensure that AI-generated content adheres to quality standards. This oversight includes checking the accuracy of the information and refining the text to add a layer of polish that technology alone cannot achieve.

To prevent misinformation, subject matter experts (SMEs) are vital in verifying the content's reliability, particularly in sensitive or specialised fields. They are essential in maintaining a consistent voice across all communications, which helps reinforce a brand's identity and makes the content easily recognisable.

Moreover, human oversight ensures that the AI-generated content's nuances, tone, and factual accuracy align with the brand's ethos. This alignment is critical for achieving high search engine results page (SERP) rankings and retaining the content's helpful nature, thereby providing readers with meaningful interactions.

Understanding Search Intent

Grasping search intent is a cornerstone of creating content that truly resonates with audiences. It refers to the underlying reason behind a user's query, and understanding this is crucial for crafting relevant content that meets user needs. Generally, search intent can be classified into four categories: informational, navigational, commercial, and transactional. Each category addresses different purposes of a user's search.

Comprehension of search intent allows content creators to tailor their work to answer users' questions and provide substantial value directly. Ensuring a piece of content aligns with the user's search intent significantly enhances its relevance, making it more likely to perform well on platforms like Google.

Aligning content closely with search intent ensures higher engagement and plays a pivotal role in achieving better rankings. Thus, for any AI-generated blog or article, thoroughly understanding and integrating search intent is integral to elevating the content's impact and helping it rise in the rankings.

Common Myths About AI Content

In the evolving landscape of the content industry, misconceptions about artificial intelligence and its role in content creation are widespread. Addressing these myths is crucial for content creators looking to leverage AI while ensuring high content quality and relevance. Here, we debunk three common myths about AI-generated content and its potential ranking on Google, emphasising the need for responsible content creation practices.

Myth 1: AI Content Cannot Rank

Contrary to popular belief, AI content can indeed rank on Google. The key is to create high-quality, relevant content that aligns with user intent and search engine guidelines.

Google ranks AI content as long as it meets the standards of Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T). Incorporating human review is essential to ensuring the content's quality, accuracy, and originality.

Overreliance on AI without human oversight might lead to compliance issues, potentially affecting ranking. To maximise the chances of AI content ranking well, it must be adequately humanised and optimised for SEO.

Myth 2: All AI Content is Low Quality

The assumption that AI-generated content is inherently low-quality is false. The quality of content, whether AI-generated or human-written, depends on how the tool is used. Used responsibly, AI can produce high-quality content and serve as a valuable tool for content creators.

Combining human insight with AI efficiency can result in informative, engaging, and valuable content that ranks well on Google. AI supports the content creation process, enhancing efficiency and analysis capabilities while maintaining the credibility and relevance of the content through human research and editing.

AI-generated content should be checked for originality, accuracy, and adherence to ethical standards to maintain high quality.

Myth 3: AI Content Can't Meet SEO Standards

Google does not ban AI content. It prioritises high-quality and relevant content, regardless of its source. The quality of AI-generated content significantly impacts SEO: well-optimised content can lift a webpage's ranking, while poor-quality content can drag it down.

Google's sophisticated algorithms are adept at analysing patterns and characteristics associated with AI-generated blogs. This capability enables it to promote original, relevant, and keyword-optimised content. Although AI can assist in writing, human expertise is indispensable to ensure that the content meets SEO standards and aligns with E-E-A-T guidelines.

Integrating AI in content creation is beneficial when supported by human oversight. It provides a seamless blend of technology and authenticity.

Dispelling myths about AI content is pivotal to harnessing its potential in the content industry. By ensuring content aligns with Google's quality standards and is infused with human expertise, AI-generated content can thrive in search engine rankings.

Advantages of AI in Content Creation

Integrating artificial intelligence into content generation has become a game-changer in the ever-evolving content industry. AI-generated content holds immense potential to transform the creation process, offering several advantages that significantly benefit content creators and marketers.

Efficiency and Speed

One of the most remarkable benefits of AI in content creation is its ability to enhance efficiency and speed. AI-generated content can be produced quickly and in large quantities, allowing content creators to maintain a steady flow of valuable content to meet audience demand. This automation saves time and conserves resources, enabling businesses to direct their efforts towards other strategic initiatives.

AI tools can produce content quickly and consistently, without fatigue, which helps improve overall content quality and consistency, although outputs still need to be checked for errors and bias. Furthermore, AI's proficiency in analysing keywords and trends supports SEO optimisation, helping the content stay competitive in search engine rankings.

However, it is crucial to blend this automation with human oversight. By adding a human touch to AI-generated content, creators can maintain the emotional nuance that resonates with audiences.

Scalability of Content Production

Another significant advantage of AI is its scalability in content production. Businesses can leverage AI to produce high-quality and relevant content on a larger scale, essential to addressing the diverse needs of various sectors and audience types. AI's ability to generate large volumes of text, images, audio, and video enables businesses to maintain a consistent web presence across multiple platforms.

By combining AI's content generation prowess with human SEO expertise, companies can optimise their content for search engine visibility. This optimisation improves discoverability and reach, enhancing the effectiveness of scalable content strategies. As the demand for quality, impactful content grows, AI stands out as a valuable resource that can help businesses efficiently achieve their content production goals.

In summary, when AI-generated content adheres to Google's quality guidelines, it has the potential to rank well in search results, seamlessly meeting SEO criteria and user intent. This makes AI a formidable ally in the quest to produce helpful and engaging content that captivates audiences and bolsters search engine rankings.

Challenges of Using AI-generated Content

In the ever-evolving content industry, artificial intelligence (AI) has made significant strides, revolutionising the content creation process. However, as with any transformative technology, AI has inherent challenges that content creators and marketers must navigate.

Addressing the issues of originality, quality, and human oversight is crucial when discussing AI-generated content. These challenges directly impact the content's rank on search engines like Google, underlining the importance of human-generated insight in content creation.

Potential for Plagiarism and Duplicate Content

AI-generated content often faces significant scrutiny for its potential to produce plagiarised or duplicate material. This is particularly concerning for search engine rankings, as search engines prioritise the quality of content. Duplicate content can seriously harm a webpage's rank.

 Google's algorithms identify duplicated content and penalise sites by lowering their rankings. It's paramount to ensure that AI-generated content is thoroughly checked for plagiarism. Failure to do so can lead to substantial search engine penalties, diminishing the content's visibility and effectiveness.

Content creators should employ robust plagiarism detection tools to mitigate these risks and manually review AI outputs for originality. This dual approach helps identify and correct content that is spun or barely altered from existing sources, enhancing the overall quality of the content.
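As a minimal illustration of the idea, and no substitute for a dedicated plagiarism tool, the sketch below uses only Python's standard difflib to flag a passage that is suspiciously close to an existing one; the sample strings and the threshold are invented for the example:

```python
# Minimal near-duplicate check using only the standard library.
# A high similarity ratio flags a passage for manual review; real plagiarism
# tools compare against large corpora, which this sketch does not attempt.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

draft = "AI-generated content can rank on Google if it is helpful and original."
source = "AI generated content can rank on Google when it is helpful and original."

score = similarity(draft, source)
print(f"Similarity: {score:.2f}")
if score > 0.8:  # threshold chosen arbitrarily for illustration
    print("Flag for manual originality review.")
```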

Limitations in Creativity and Nuance

While AI has shown adeptness at generating technically accurate content, it struggles with infusing creativity and the subtle nuances crucial in engaging an audience. AI's creative capabilities are limited by its dependence on pre-established programming, which lacks the emotional intelligence to convey feelings or situational context. This often results in content that, although accurate, might fail to resonate with readers on a deeper level.

Human writers excel at weaving creativity and emotional intelligence into content, crafting relatable and engaging pieces. Conversely, AI necessitates human intervention to review and edit its work for depth and insight. This is where human editors become indispensable. By ensuring that AI-generated content is reviewed for tone, context, and nuance, content creators can enhance the value and relevance of their work, thereby aligning it more closely with user intent and boosting its search engine rankings.

In summary, even as AI plays a growing role in content generation, it's imperative to acknowledge and address its limitations. Combining AI efficiency with human creativity and oversight allows content creators to produce valuable, original content that meets search engine quality standards and satisfies user intent.

Adhering to Google's Quality Guidelines

In the rapidly evolving digital age, Google remains a crucial gatekeeper of information, and understanding its quality guidelines is essential for anyone involved in content creation. Whether content is crafted by human writers or generated by artificial intelligence, Google makes no distinction—what matters is the substance and relevance of the content. The quality of content is paramount, as Google prioritises valuable and user-centred material over pages filled with meaningless keywords.

Originality is at the crux of Google's standards. Content must be unique and not merely a rehashed version of existing materials. This is where AI-generated articles can shine if executed thoughtfully. By harnessing the power of AI, content creators can produce fresh and engaging pieces that respond directly to user intent, effectively addressing search queries and enhancing visibility.

Relevance and user experience are equally significant in search engine rankings. Content must meet the user's needs while ensuring a visually appealing and navigable experience. Factors like E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) are critical for credible AI and human-generated content. Adhering to these principles ensures your content aligns with Google's standards and enhances its ranking potential.

Best Practices for AI Content

Creating AI-generated content that resonates and ranks well on Google is an art that demands attention to detail and strategic foresight. To elevate your AI content, focus on leveraging structured data. By marking up your content with schema, you help search engines understand your page better, which can significantly boost your rankings. This practice is not just about feeding information to search engines but about ensuring it is presented in a way that enhances user engagement.
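One common way to add that markup is a JSON-LD block in the page's head; the Python sketch below, with placeholder field values, shows what generating an Article schema block might look like:

```python
# Sketch: build a JSON-LD Article schema block to embed in a page's <head>.
# Field values are placeholders; adjust to the actual article and author.
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Debunking Myths: Can AI Content Actually Rank on Google?",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2024-01-01",
    "description": "How AI-generated content can meet Google's quality guidelines.",
}

script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(article_schema, indent=2)
    + "</script>"
)
print(script_tag)
```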

Refinement is crucial. Use tools like Grammarly, ProWritingAid, and the Hemingway App to polish your AI-generated articles. These tools help eliminate typos, address grammatical issues, and ensure readability, all contributing to a positive user experience.

Furthermore, optimising AI content post-publication is vital. Generating high-quality backlinks is a powerful tactic for improving search rankings. Combine this with strategies that enhance user experience and clever keyword placement, and you can effectively align your AI-generated content with Google's algorithms.

Regular Auditing and Updating of Content

Consistency and relevance are keys to maintaining high-quality content in Google's eyes. Regular auditing and updating are pivotal in ensuring your content stays fresh and pertinent. Google's algorithm values content that keeps pace with current trends and events, influencing rankings positively. This process is essential for content generators that produce valuable and helpful content.

Content audits should prioritise quality and accuracy, ensuring details remain superior and informative. Improving user experience, including loading speed and mobile-friendliness, is crucial during these audits. Furthermore, engaging with users through interactive elements such as polls or calls to action can significantly boost engagement metrics and relevance.

Content creators can enhance the visibility of AI-generated blog posts by incorporating effective SEO strategies during updates.

This proactive approach keeps content appealing to search engines and users, positioning you at the forefront of the content industry. Regular updates and audits reinforce your commitment to providing helpful, relevant content that stands out in the digital landscape.

Future Outlook: AI Content in SEO

As the digital landscape evolves, artificial intelligence increasingly plays a pivotal role in content creation. But how does this translate to search engine rankings? Google, always at the forefront of refining search algorithms, emphasises the quality and relevance of content. To rank effectively, AI-generated content must meet Google's E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) guidelines.

AI offers immense potential in the content industry by enhancing SEO with optimised keywords and streamlined structures. However, this content mustn't be spammy or low-quality, which could negatively impact search engine rankings.

Advantages of AI Content in SEO:

| Benefit | Description |
| --- | --- |
| Efficiency | Large volumes of quality content created quickly. |
| SEO Optimisation | Automated keyword and structural enhancements. |
| Consistency | Uniform tone and style across multiple pieces. |

However, human involvement remains critical. The collaborative content generation process, which blends AI with human expertise, ensures the generation of valuable content that genuinely aligns with user intent. Ultimately, the future of search engine content will hinge on how well AI and human writers work together to create relevant, helpful content that resonates with users and adheres to search engine standards.

How to write sales copy for results.

Crafting Compelling Sales Copy for Stellar Results

Writing compelling sales copy can be the difference between a potential customer scrolling past your offer and taking action. It’s an art and a science that combines persuasive techniques with an understanding of your audience. Here are some key strategies to help you create sales copy that drives results:

1. Understand Your Audience

Before you even begin writing, understand who you're speaking to. Research your audience’s demographics, needs, pain points, and desires. Tailoring your message to resonate with them will increase engagement and encourage them to act.

2. Start with a Strong Headline

Your headline is the first impression, so it needs to grab attention immediately. Use powerful words, ask questions, or present a compelling benefit to draw readers in. A great headline can significantly increase your open and click-through rates.

3. Focus on Benefits, Not Features

People are more interested in how a product or service will improve their lives rather than just its features. Outline clear benefits that address the problems your audience faces. Show them how your offering can make their lives easier, better, or more enjoyable.

4. Build Trust with Social Proof

Incorporate testimonials, reviews, or case studies to build credibility. When potential customers see that others have had positive experiences, they’re more likely to trust your brand and make a purchase. Include real names and photos (with permission) to enhance authenticity.

5. Create a Sense of Urgency

Encourage potential buyers to act quickly by creating a sense of urgency. Phrases like “limited time offer” or “while supplies last” can prompt immediate action. Just be sure not to overuse this tactic — it should feel genuine and not forced.

6. Use Clear and Concise Language

Keep your language simple. Avoid jargon that could confuse your audience. Clear, concise copy is easier to read and understand, leading to higher conversion rates.

7. End with a Strong Call to Action (CTA)

Your sales copy should always guide readers toward a specific action. Whether you want them to sign up for a newsletter, purchase, or download a free resource, your CTA must be clear and compelling. Use action-oriented language to motivate them, such as “Buy Now,” “Start Your Free Trial,” or “Join Our Community.”

8. Edit and Revise

Don’t underestimate the power of editing. Take a break after writing and return to it with fresh eyes to catch errors and improve clarity. Consider using online tools or getting peer feedback to refine your copy further.

You've Got This.

Writing compelling sales copy is not just about making a sale; it’s about connecting with your audience and providing value. Understanding your readers, focusing on benefits, and crafting a clear CTA will increase your chances of generating leads and closing sales. Keep practising and refining your approach; you’ll see positive results soon!

Cricket & the Ball Tampering Episode in South Africa.

This has been an absolute beat-up by the oh-so-perfect media.

Let's try to put some perspective into this.

Ball tampering is seen as cheating, which it is, but some level of ball tampering is tolerated even though it is not strictly legal under the rules of cricket.

Polishing only one side of the ball is tampering under the definition of altering a ball's condition to affect its aerodynamic characteristics.  Returning the ball to the keeper on the bounce to rough up the surface is also ball tampering.

These are condoned and accepted practices despite falling outside the strict letter of the law.

Under the rules of cricket the penalty for ball tampering is the replacement of the ball and 5 runs added to the batting team's total.  If the bowler is repeatedly tampering with the ball he may be suspended from bowling for the remainder of the innings, and further sanctions may be laid at the end of the match.

That's it: no 12-month suspension, no sackings, no public disgracing by the totally squeaky-clean, never-done-anything-wrong (even on our tax returns, travel claims or shopping online while at work) media.

In the past the English team, who later actually admitted to ball tampering during an Ashes series against Australia, got Knighted for the results of their cheating.

South African fans have clearly forgotten the numerous incidents of ball tampering by their own team even as recently as 2014.

India and Pakistan have their own incidents to reflect on.  Not all teams have been caught, that doesn't mean they haven't broken rules, just that they haven't been caught doing so.

See this link at the bottom of the page for the references.

While it is disappointing to see our players behaving badly on and off the field they are still young men who want to perform well for their country in their chosen sport.

Sometimes emotions get in the way of good decision making and I can attest to not always playing sport in the correct spirit of the game on occasion.

In case you have forgotten, this is entertainment not life and death.  It really doesn't matter.

The confected outrage from media and fans alike should be treated for what it is: BS.

I don't know of anyone in any position (government, private industry, public office, services of any sort, including the churches and any religion) who hasn't done something in their life which is outside the bounds of the strict sense of the laws and rules by which we all live.

Perhaps the Dalai Lama, but we should probably ask him, as he may disabuse us of that notion.

It may be small or it may be large but we have all overstepped the rules at some stage.

So, since this is Easter let me just say that he who is without sin should cast the first stone.

Special post as a warning.

Hello to all who read this.

This post is not one of my normal Internet Marketing or progress reports on the program but a warning about using power tools.

This weekend I was working with my wife and eldest son at his house clearing up the aftermath of the recent storms.

Log splitter

One of his trees had to be cut down and we were splitting the wood with a hydraulic wood splitter.  These are amazingly effective at splitting pretty much any size log.

We had been working steadily for a couple of hours with a lunch break and had become comfortable with the operation of the equipment, but had also become a little complacent with regard to the safe operation of the machine.

Both Matt and I had had a couple of near misses.  Both of us had caught the tip of a glove in some timber movement, but neither of us had stopped to think about what that really meant and what could have happened.

Both times we had those near misses it was because we were using the equipment incorrectly.  Some of the logs were too long to go on the splitting deck grain-on to the blade, so we had begun putting them in sideways and cutting across the grain to shorten them first.

Because the blade didn't go all the way to the back plate we were also putting a block of wood behind the logs; this was the biggest mistake.  During the cutting motion sometimes one or more pieces of wood would ride up, and these were the bits which caught the glove tips.

This is what caught me.  I got two fingers caught between a piece of wood and the backing plate.  Mainly because I wasn't watching what my left hand was doing.  The automatic reaction to getting a finger wedged against something is to pull your hand away, which I did.  Unfortunately I left a couple of bits of finger and two nails behind when I did that.

They were left in the glove I was wearing.  I don't have any gory photos to show you but it was very bloody and I could see bone through the mangled finger tips.  Yes it did hurt, a lot, I did say a string of rude words.

After an ambulance ride and an overnight stay in the hospital I had plastic surgery for a couple of hours, after which it seems that I still have full-length fingers, if a bit thinner.

I am very grateful to the SA Ambulance crew, who were very professional, and to all the Flinders Hospital staff, who were also on top of their game and did their jobs with professional efficiency while being very compassionate and caring.

To the plastic surgery team, I'll have a better idea of how well you did your job on Thursday when I get the finger reveal but it seems that you also deserve my gratitude.

I take all responsibility for this injury.  It was entirely my mistake and could have been easily avoided.  This post is just to make anyone who wants to use any power tool think a little about what they are doing, use the tool correctly and watch what both hands are doing.

I have now added a private post to this blog with photos of the repaired injury to my fingers.  I don't have any of the initial trauma, yet, but I will post them if I can get a copy.

There is a warning here, if you don't want to see these images don't click the link.

To the private post.

The next private post - Day 8

Continuing saga - Day 11

Dressing change at home - Day 14

Dressing change at home - Day 17

Thursday at FMC - Day 18

Monday dressing change FMC - Day 22

Thursday at FMC - Day 25

Thursday at FMC - Day 32

Thursday at FMC - Day 39

Thursday not at FMC - Day 46

Thursday at FMC and the Sunday after - Day 53

Now that February has ended …

Sad but true that the program is not yet ready for prime time.

What happened?

I made a seriously big mistake and have spent the last couple or three weeks fixing it.

I was testing the backup & restore functions in the WP Maintenance Robot and I boobooed.

I restored an old and incorrect database to my main website.  The restore worked fine, unfortunately, and I trashed my site completely.

I have now proved that it is possible to restore a WordPress database manually but it is a time consuming and very painful way to do it.  A couple of the tables took 4-5 days to rebuild.

I made mistakes in the rebuild as well; my WordFence plugin lost all its settings even though I thought I had them right.

I'll just have to re-install it and set it up again.

Why did I have to test the backup and restore functions?

In their infinite wisdom Oracle have 'fixed' the 'security issue' with mysqlbackup.

They decided that putting the username and password in plain text on the command line on your own servers was insecure.

To fix this they have implemented a 'secure' system.  Now you need to create a secure and encrypted file and call that on the command line.

This is actually less secure than the previous system because a hacker no longer needs to know your username and password, all they have to do is use the default file name and all your data are theirs.

To implement this remotely for every unknown site really isn't possible so I had to do a complete rewrite of the backup process.  That was an unexpected delay but it's pretty much done.

The real test is to restore a site from a backup I made, this time I'll do a proper database dump first just in case it fouls up again.
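As a rough sketch of that precaution, the snippet below takes a full dump before any restore is attempted. The database name, login-path name and file naming are placeholders, and it assumes the standard mysqldump client with a login path created beforehand via mysql_config_editor, rather than mysqlbackup itself:

```python
# Sketch: take a full database dump before attempting a restore, so a bad
# restore can be rolled back. Names and paths are placeholders; assumes the
# standard mysqldump client and a login path created with mysql_config_editor
# (e.g. mysql_config_editor set --login-path=wp_backup --user=... --password).
import subprocess
from datetime import datetime

DB_NAME = "wordpress_main"      # placeholder database name
LOGIN_PATH = "wp_backup"        # non-default login-path name, created beforehand
dump_file = f"{DB_NAME}-{datetime.now():%Y%m%d-%H%M%S}.sql"

with open(dump_file, "w") as out:
    subprocess.run(
        ["mysqldump", f"--login-path={LOGIN_PATH}", "--single-transaction", DB_NAME],
        stdout=out,
        check=True,   # raise if the dump fails, so the restore never starts
    )
print(f"Safety dump written to {dump_file}; safe to test the restore now.")
```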

These little issues have put the delivery of the program back by about a month I guess but there is another issue to deal with.

My wife and I have a wedding to go to in Bali in 3 weeks time, 5 weeks after we get back from that we go to Europe for 7 weeks.

If it's not ready before then you won't get to see it before the end of July.

Definitely keen to get this done and out there.

Political Agendas Have No Place In Disaster Situations.

As some of you will know, I live near Adelaide in South Australia.  You may also know that we have a serious bush fire burning quite close to this city now, the date today is Monday 5th January 2015 and the fire has been burning since last Friday.

It will be several days yet before the CFS (Country Fire Service) can get it put out.

These people are almost all volunteers and the service will take any reasonably able-bodied person and train them to do the job.

They risk their lives to protect people and property, even though residents often don't take sufficient precautions to protect themselves yet insist on living in high fire-risk areas.

These statements are, I know, controversial, but I am not picking on those farmers and others who do do the right thing; I am aiming this directly at those who do not.

However, there is a far bigger issue here than the residents in those risky areas.

I saw the leader of the Greens political party make a big statement about how this fire is a warning about Climate Change.

This is a load of crap.  Fires like this are a direct result of environmentalists preventing the authorities from doing burn-offs in the forests and national parks.

They claim that burning off destroys the environment.  This is not correct.  Cool burns, as experienced in a controlled burn-off, reduce the understorey fuel load and protect the environment.  The burnt areas usually recover inside 12 months.  Trees and forests are not killed by a cool burn; they are rejuvenated.  There are many plants in Australia whose seeds need to experience a burn to germinate.

Very few of the local fauna have any issues with a cool burn as it doesn't travel fast and they have little trouble in escaping the fire.  Generally only relatively small areas are burnt in any single burn-off so only small areas need to regrow and the local fauna have areas to survive in until recovery happens.

Hot burns like the one being experienced at the moment in the Adelaide Hills destroy massive areas of environment which will take years to recover.  The local fauna have little chance to escape as the fire moves so fast and they have no islands to survive in after the fire has passed.

Big bush fires are far more destructive than any series of cool burns will ever be and they cannot get a good hold on the bush if the burn-offs have been done regularly.

So, bush fires like this are a warning about not doing burn-offs due to misguided environmentalists who are the core constituents of the Green party.

Another annoyance to me is that it seems that the bulk of the environmentalist movement don't actually care about the environment at all, they just want to bleat about others not caring.

In Australia any environmentalist who is not a member of the CFS is just not serious.  The biggest threat to the Australian environment is fire, it has been for centuries, so if you are genuinely serious about protecting the environment you must be a member of the CFS or, at the very least, provide support services for those courageous volunteers.

Any of you who are CFS volunteers and also environmentalists should now comment to tell me that I'm wrong.  I'm not expecting many comments.

I have trouble believing that I have spent so much time on this program.

But it's true.  I have not given up and I do have a working program which I use for my own purposes.  Unfortunately for you it's written in Python on Mint Linux and I have had problems getting it to compile as a stand-alone program on Windows or the Mac.

Mostly due to the dependencies of the modules I used to create the program in the first place.  Yes Python is cross-platform and yes it is possible to create stand-alone programs with Python for each platform but for a non-trivial program it's quite difficult to get all environments to match each other and perform the same. Quite apart from the issues of getting the layout to look right on each platform.

Because of those problems I migrated the entire program to PureBasic and I have it working there as well.

PureBasic has the advantage of compiling to stand-alone on the Mac, Windows and Linux from essentially the same code.  There are some minor issues with the different operating systems but they can be resolved with compiler options in the code itself as long as you don't use any of the OS specific functions for any platform.

I didn't release that version though because of the Brute Force hacking of WordPress sites which is still going on.  Instead I wrote a tool to foil the hacker(s) who are behind this attack.  You can find this tool, why it works and how to use it, free to use, at the http://wpmaintenancerobot.com website.

When I had completed that program I had a program to write for work to assist in a particular function which required 100% accuracy of data transfer.  I wrote that one in Windows using Portable-Python so I could work on it at home as well as at work using the same platform, Python version, modules etc.

At home I currently run a MacBook Pro with Parallels so I can also run Windows 7 and Mint Linux all at the same time all doing different things.  These days one screen is just not enough but that's what I have to work with at home at the moment.  I have three screens at work which I like.  Multiple desktops is excellent but not quite the same.  Imagine multiple desktops on multiple screens...it would take a fair amount of discipline to keep that lot under control but I'd love to try.

Anyway, that project has now been completed successfully and may never be used again, as the contract we needed the tool for has been cancelled.

Now I need to incorporate the changes into the WordPress Maintenance Robot before you can have that one. I am hopeful that I can get this completed before the end of January 2015.

I'm thinking about how I market this program.  I think every WordPress site owner should use this, it really does make those updates easier and faster.  Yes I am aware that WordPress now does some of the updates on autopilot but not all and it doesn't update the plugins or themes for you.

Yes I do know that there are a number of other tools which will handle the updates for you and all of them either require you to add a plugin to the sites you want managed and/or are managed from an online dashboard somewhere that you don't control.

The WordPress Maintenance Robot is different in that it is a desktop tool.  It lives on your computer and is as secure as your PC or Mac.  You do not have to install a plugin on any WordPress site to make this work, so you can do updates for other people if they give you the login details or even set you up with your own unique login details.

The site access details are encrypted to a local database on your computer and are only used for site access as required.  Backups are stored securely on the site they were made on and are only accessible from the link built into the PDF report.

You can get a PDF report for the backup activity and the update activity which you can email to a client if you are doing this as a service.  I would suggest downloading their backups to your computer or other safe storage as a precaution as clients don't always do what you ask them to do.

Other backup tools mostly leave the zip file in a known location and with a known naming pattern.  If I know the location and pattern I can get your backups with a bit of brute force testing.  Once I have that your site is pwned and all your base are mine.
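Avoiding that particular weakness is cheap. Here is a sketch of one approach, using Python's secrets module to generate an unguessable backup file name; the prefix and extension are just examples:

```python
# Sketch: give each backup archive an unguessable name so it cannot be found
# by probing a known location and naming pattern. Prefix and extension are
# examples only; the name (or a token) still has to be recorded somewhere
# the site owner controls.
import secrets

def random_backup_name(prefix: str = "bk", ext: str = "zip") -> str:
    token = secrets.token_urlsafe(24)   # ~32 URL-safe characters of randomness
    return f"{prefix}-{token}.{ext}"

print(random_backup_name())   # different every run
```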

So, what is this functionality worth?  Frankly it's probably worth a lot more than you are willing to pay until your site is taken down or compromised.  I really would like everyone to be able to use this to protect themselves so my current thinking is that a single site will get full functionality for free.

More than one site will cost a monthly fee of $9.99 per site for up to 5 sites and then $6.99 per site for up to 10 sites and $3.99 per site for more than 10 sites.  This is not fixed yet but it indicates where I'm thinking with this at the moment.

I'm also thinking about having a fully automated version so that all you would have to do is plug in your details and wait for the regular emails to tell you all is well, updates have been made and where to get the latest backup files.

I would welcome some feedback from you on this.  In any case, all plans are subject to change with little or no notice.

It has been so long.

I have been still working on that program and it still isn't available but...

I have the website, nearly.  I registered the domain as I needed to begin the process of setting up registration etc.  The new website is at http://wpmaintenancerobot.com.  Not much to see there yet I'm afraid but it's a start.

After much fiddling around with Pascal I found that there isn't a good module to do what I want to do with a website.  The same issue with Tcl/Tk.  Both these languages are pretty advanced in lots of ways but they just didn't have anything that matches Twill in Python let alone exceeds it.

After losing several months trying to make it work I finally stepped back, took a deep breath and picked up where I left off with Python.  I'll just have to work out how to manage the compiling issue.

I have actually been having a go at that and discovered that I needed to re-write the GUI in Wx-Python because the module I was using, PythonCard, creates multiple files and doesn't compile well.

I have brought all the imports into a single file and written all the modules in the same file.  This is not considered best practice, as it can be difficult to maintain a program written this way.  Unfortunately it is necessary to do it this way because the compiler tools only manage to import the dependencies for the main program and not the includes, which means they won't be available and the program fails.  Spaghetti code, here we come.

So here we are, 6 more months down the track and what do I have to show for it?

There are still a couple of bugs in the backup and restore modules.  Nothing major I think but the backup just doesn't finish off nicely on one of my sites, I keep getting a 404 page not found error even though everything has completed and the output works on all the other blogs.  Hmmmm.  I'll find it and sort it out.

I realised that I needed to add significantly more security to the backup module.  As with almost all the backup tools I was leaving the zipped up files on the host for later download.  Anyone who got hold of my program, that would be anyone who wanted it, could identify how I was naming the files and then run a small program to check all known sites for the presence of this file, download it and have access to all the content, all the security tools in use, all the usernames and their access levels and all their password hashes.  A brute-force attack on those hashes on their own computer would find collisions and therefore access to the site fairly quickly.

That can't happen now.  The zip files are stored in the database as 64kb blobs in random order so even if an attacker gains access to the database they cannot put the files together again without the key, which isn't stored on the host.  Of course if an attacker does gain access to the database then putting the files together is not something they would bother with, they already have all your data etc.  No, this is not to protect you from direct attacks but to protect you from the attackers who have access to my program so they cannot learn how the files are stored and therefore create an automated way of downloading them from your database.
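To give a feel for the general idea, here is a simplified Python sketch, not the Robot's actual code: it splits an archive into 64 KB blobs, stores them in a database in a shuffled order, and keeps the reassembly map on the local machine; it uses SQLite for brevity and leaves out the encryption the real tool adds:

```python
# Simplified sketch of the idea (not the Robot's actual code): split a zip
# into 64 KB blobs, store them in a database in random order, and keep the
# reassembly order locally rather than on the host.
import random
import sqlite3

CHUNK = 64 * 1024

def store_chunks(zip_path, db_path):
    with open(zip_path, "rb") as f:
        chunks = []
        while blob := f.read(CHUNK):
            chunks.append(blob)

    order = list(range(len(chunks)))
    random.shuffle(order)               # storage order reveals nothing

    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS backup_blobs (slot INTEGER PRIMARY KEY, data BLOB)")
    for slot, idx in enumerate(order):
        con.execute("INSERT INTO backup_blobs (slot, data) VALUES (?, ?)", (slot, chunks[idx]))
    con.commit()
    con.close()

    # The reassembly map is the 'key': keep it on the local machine, not the host.
    return order

def restore_chunks(db_path, order, out_path):
    con = sqlite3.connect(db_path)
    rows = dict(con.execute("SELECT slot, data FROM backup_blobs"))
    con.close()
    chunks = [None] * len(order)
    for slot, idx in enumerate(order):
        chunks[idx] = rows[slot]
    with open(out_path, "wb") as f:
        f.write(b"".join(chunks))
```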

This may be overkill but I'm a bit of a belt and braces person by nature anyway.  Can't hurt to be extra careful.

The other thing I have been surprised at is the lack of serious competition in this field even after four years of development.  There are competitors but they seem to be mostly concentrating on putting all their eggs into plugin development and then using tokens to access other sites.  This requires that all the sites you want to look after also have a plugin installed.

This is a major point of difference.  My tool is a desktop-based tool and all your data is safely stored in an encrypted database on your computer.  It is as safe as your computer, with a little additional protection thrown in.  This does make you responsible for your own data, but isn't that how it should be?  Why should I trust a third party's website and host with the login details to my sites?  I'm not even really happy with cloud-based storage.  It could change in the blink of an eye if the host goes belly up or decides to change the rules and locks down your data.  Nup, look after it yourself.  Use the cloud to transfer data but store it locally and remove it from the cloud when you no longer need it there.

Actually, don't just remove it: overwrite it with non-sensitive data, save it and then delete it.  Any recovery done on the files will then recover the last saved data, not the sensitive stuff.  Anyone who is interested in more information about WordPress security or what a secure password looks like can find it at http://guruburtlm.hubpages.com/hub/preventing-late-night-wordpress-cleaning.
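Here is a quick Python sketch of that overwrite-then-delete idea; it is a rough precaution rather than a guarantee against forensic recovery, and the example path is made up:

```python
# Sketch: overwrite a file with non-sensitive filler before deleting it, so a
# naive undelete recovers the filler rather than the original content.
# Not a guarantee against forensic recovery (SSDs and copy-on-write
# filesystems may keep older blocks around).
import os

def overwrite_then_delete(path):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(b"0" * size)        # replace contents with filler
        f.flush()
        os.fsync(f.fileno())        # push the filler to disk
    os.remove(path)

# overwrite_then_delete("old-credentials.txt")   # hypothetical example path
```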

Look for a progress update soon.

The continuing saga of automating the updates.

As of several months ago I do have a fully functioning WP Update tool.

It does back-up and restore to/from the host or a back-up site which you have ftp access to.
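For the FTP side of that, a minimal Python sketch using the standard ftplib module might look like the following; the host, credentials and file name are placeholders, and the real tool reads them from its encrypted local database:

```python
# Minimal sketch: push a backup archive to a backup site over FTP.
# Host, credentials and file name are placeholders. FTP_TLS is preferred when
# the backup host supports it, since plain FTP sends credentials in clear text.
import os
from ftplib import FTP_TLS

def upload_backup(host, user, password, local_zip):
    ftps = FTP_TLS(host)
    ftps.login(user, password)
    ftps.prot_p()                       # encrypt the data channel too
    with open(local_zip, "rb") as f:
        ftps.storbinary(f"STOR {os.path.basename(local_zip)}", f)
    ftps.quit()

# upload_backup("backup.example.com", "user", "secret", "site-backup.zip")
```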

It does full update of your WordPress blog site.  That's any plugins or themes as well as the main core and it does it all on auto-pilot from a database of your blogs.

It handles MUWP sites and single stand-alone sites and it runs perfectly for me.

I run it regularly to update my sites (about 20) and I can get a PDF report for each one independently so I am very pleased with the functionality.

At this stage it is only available for Linux machines which is not what I wanted.

Did I mention at any stage how hard it is to create a single tool which can be compiled into an executable for multiple platforms?

Python is an excellent scripting language to write in, there are many libraries to add in which make doing all sorts of tricky stuff possible, and the code is cross-platform capable, but...

To build for the other platforms you have to build on those other platforms which means that you have to have exactly the same libraries and dependencies on each of those platforms.

This makes it a lot more difficult to keep the source files consistent across those platforms and the chance of introducing bugs is massively increased.

Add to that, the use of a scripting language makes it difficult to protect your source code from all but the most dedicated of hackers.  Full protection is impossible, but good protection shouldn't be.

After months of thought, testing, and discussion in forums and with people who have some knowledge of the area, I found two other programming languages which have cross-platform capabilities.

I tried both TCL/TK and Free Pascal.  There are issues with both but, at this stage Free Pascal is winning.  Mainly because I can compile a full executable program from Lazarus (the IDE for FPC) and I can compile for other platforms by changing the target within Lazarus so I am really using the same code for all platforms.

More testing is required yet but it is looking promising.

Yet again I need to learn a new syntax and this one is very different so it is taking a bit longer but I am encouraged by knowing that my algorithm and code does work correctly.

Once completed the Pascal based program should be faster and I can use threading to speed things up even more.  I am still excited by the possibilities of this tool even though I have been working on it for over a year now.

How much longer?  I don't really know but perhaps 6 months as I am only a part time programmer with a busy family life and only get to write for an hour or so maximum per day.  Barely enough time to work out where I was up to last time.