Profile

Work

I am a Full-Stack Web & Software Developer with over fifteen years of experience. Over countless hours I have discovered a passion and a talent for creating work that is both highly functional and technically sound.

I have also discovered that I have a knack for understanding new theories and concepts, and I am an adamant perfectionist when it comes to just about anything I do.

Current Skill Set

Web
  • Responsive, mobile-first HTML & CSS3 – Bootstrap, SASS
  • JavaScript (Vanilla JS, jQuery, Angular)
  • PHP (web server & cron)
  • CRM / CMS packages – WordPress, osCommerce, Marketo
  • MySQL
  • Git
  • SEO & PPC
Applications
  • Java
  • Python
Legacy Technology
  • ASP (classic)
  • Visual Basic
  • Adobe Flash
Software Packages
  • Adobe Photoshop
  • MySQL
  • JetBrains Web Suite
  • Microsoft Word
  • Microsoft Excel
  • Microsoft Access
  • Maya 3D
  • and more…

Study

At the start of 2019, I am entering the fourth year of a six-year course studying a Bachelor of
Computer Science with a gaming specialisation at Charles Sturt University.

Hobbies

Pyrotechnics

I have been a licensed pyrotechnician in the state of N.S.W., Australia for over twenty years,
performing professional-grade pyrotechnic shows ranging from Chinese string crackers and indoor
close-proximity fireworks to aerial shells up to 125 mm and aerial salutes up to 75 mm.

A founding member of the Pyrotechnics Industry Association of Australia (PIAA), I am based in
Sydney but have performed shows all over NSW for all types of events.

Car Enthusiast

An active member of the Skylines Australia NSW car club, I regularly volunteer to help run events.

Social Media

You can connect with me professionally on LinkedIn,
or stalk me on Twitter.

Portfolio

Employment

Wizardry Fireworks

PRODOCOM Australia

Hannover Fairs Australia

E-Web Marketing

Freelance Websites

Personal Projects

Tipping Comp

cruizen’

National Pyrotechnics

iblott accessories

Blog

Decoding Robots.txt: A Guide to SEO’s Gatekeeper for Web Crawlers

Introduction: In the intricate world of Search Engine Optimization (SEO), the robots.txt file stands as a gatekeeper, determining how web crawlers access and index your website. This blog post serves as a comprehensive guide to unravel the significance of robots.txt in SEO, providing insights, strategies, and best practices to ensure your website is effectively managed for success in search engine rankings.

  1. The Robots.txt File: Your Website’s Rulebook for Crawlers: Begin by understanding the fundamental concept of robots.txt. Explore how this plain text file serves as a set of directives, guiding web crawlers on which parts of your website to crawl and which to avoid.
  2. Robots.txt Syntax: Navigating the Directives: Dive into the syntax of robots.txt directives. Understand how to structure and format your robots.txt file, utilizing user-agents and disallow rules to communicate specific instructions to different web crawlers (a short combined example follows this list).
  3. User-Agents: Tailoring Directives for Different Crawlers: Uncover the power of user-agents in robots.txt. Learn how to tailor directives to specific web crawlers, ensuring that different search engines and bots receive customized instructions based on their capabilities and behaviors.
  4. Disallow Directive: Safeguarding Sensitive Content: Explore the disallow directive and its role in instructing web crawlers to avoid certain sections of your website. Learn how this directive safeguards sensitive content from being indexed, enhancing your control over what appears in search engine results.
  5. Allow Directive: Granting Access to Specific Content: Delve into the allow directive as a counterpart to disallow. Understand how to selectively grant access to specific content, ensuring that valuable pages are crawled and indexed by search engines.
  6. Wildcard Usage: Scaling Robots.txt for Efficiency: Embrace the power of wildcards in robots.txt. Explore how using wildcards can simplify the management of directives, allowing you to apply rules to entire sections of your website efficiently.
  7. Crawl Delay: Regulating Crawl Frequency for SEO Harmony: Prioritize crawl delay in your robots.txt strategy. Understand how this directive regulates the frequency at which web crawlers access your site, preventing potential strain on your server resources.
  8. Testing and Validating Robots.txt: Ensuring Effective Communication: Dive into the importance of testing and validating your robots.txt file. Learn how to use tools like Google Search Console to ensure effective communication between your directives and web crawlers, preventing unintentional issues (a small local self-check sketch appears after the conclusion).
  9. Handling Dynamic URLs and Parameters: SEO Considerations: Explore strategies for handling dynamic URLs and parameters in robots.txt. Understand how to navigate the challenges posed by dynamic content and e-commerce platforms, ensuring that essential pages are properly crawled.
  10. Robots.txt and SEO Best Practices: Navigating the Future: Conclude with SEO best practices related to robots.txt. Explore considerations for future-proofing your robots.txt file, adapting to evolving search engine algorithms, and maintaining a balance between openness for indexing and protection of sensitive content.
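
By way of illustration, a small robots.txt that combines several of the directives above might look like the following. The paths, bot name and sitemap URL are placeholders only, not drawn from any real site:

    # Rules for all crawlers
    User-agent: *
    Allow: /blog/                  # explicitly permit the blog section
    Disallow: /admin/              # keep the admin area out of search results
    Disallow: /*?sessionid=        # wildcard: skip URLs carrying a session parameter

    # A stricter policy for one (hypothetical) crawler
    User-agent: ExampleBot
    Crawl-delay: 10                # ask this bot to wait 10 seconds between requests

    Sitemap: https://www.example.com/sitemap.xml

Keep in mind that wildcard matching and Crawl-delay are honoured differently (or not at all) from one crawler to the next, which is one reason to scope the stricter rules to a single user-agent as above.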

Conclusion: In the dynamic landscape of SEO, the robots.txt file stands as a critical tool that shapes how web crawlers interact with your website. By mastering the intricacies of creating and optimizing your robots.txt file, you not only gain control over the indexing process but also contribute to a more efficient and harmonious relationship with search engines. Implement the insights from this guide to ensure your robots.txt becomes a strategic asset in your quest for SEO success.
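
As point 8 above suggests, it pays to validate a robots.txt file before relying on it. Alongside tools like Google Search Console, a quick local sanity check can be done with Python's standard library. The sketch below is illustrative only; the rules and paths are hypothetical:

    # Minimal sketch: checking a robots.txt policy locally with Python's built-in parser.
    from urllib.robotparser import RobotFileParser

    # Hypothetical rules. urllib.robotparser applies rules in order (first match wins),
    # so the narrow Allow line is listed before the broader Disallow.
    rules = """
    User-agent: *
    Allow: /admin/public-report.html
    Disallow: /admin/
    """.splitlines()

    parser = RobotFileParser()
    parser.parse(rules)

    # Ask whether a generic crawler ("*") may fetch each path.
    for path in ("/admin/secret.html", "/admin/public-report.html", "/blog/"):
        verdict = "allowed" if parser.can_fetch("*", path) else "blocked"
        print(path, "->", verdict)

Note that urllib.robotparser implements the original standard's simple prefix matching, so it will not evaluate Googlebot-style wildcards; for those, Google's own tools remain the safer check.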

Published March 5, 2024, 1:02 am

Creative

I studied Digital Media for 12 months at Mt Druitt TAFE, where we studied the 3D modelling program Maya. During the course I produced a number of 3D scenes and a couple of 3D animated movies.

I have also played with other programs such as Bryce 3D, 3D Studio Max, LightWave, Vue d'Esprit and a few others. However, I have always returned to Maya as a personal preference.

Below are a number of works that I have produced with these various programs.

Bryce 3D: Balls

Vue D’esprit: Sulfuric

Maya: Living Room

Maya: Gauntlet
