TurboAnchor

Complete SEO Project Management SaaS Tool



TurboAnchor is a complete SEO project management SaaS tool. It can manage all aspects of an SEO project, from staff management to the finer details of on-page and off-page optimization. With content management included in the tool, there is no need for a separate writing tool.

It can analyze site visitors and website information such as Alexa data, SimilarWeb data, whois data, social media data, Moz checks, search engine index status, IP analysis, malware checks, etc., combined with other great SEO tools such as link analysis, keyword position analysis, automatic keyword suggestion, page status checks, backlink creation/search, website ping, Google Ads scraping, and more.

Most importantly, it has a Magic Link Finder that fetches keyword-related link opportunities, including Web 2.0 sites, forums, guest posting sites, article directories, and much more.

It’s a one-stop shop for all your SEO requirements.

You can also hire our team of professionals if you do not want to get involved in content development & link building.

Go to the TurboAnchor homepage at https://turboanchor.com/index.php and click “Sign Up”. Fill in the sign-up form and verify the confirmation email.
Then log in with your credentials at https://turboanchor.com/home/login_page

You can save all of your projects here and manage them individually.

Click “Add Project” to open the Add New Project window.

To make changes to a project, choose “Edit Project” and you will be taken back to the same dialogue, where you can edit the project.

To select a project you want to work on, choose “Manage Project” and you will be redirected to that project’s dashboard, where you can manage it.


While in “Project Management”, select “Add Configurations” for the project’s settings.

How to Get a Google API Key

Follow these steps to get an API key from Google.

Several TurboAnchor features need a proxy to work.

Select “Manage Proxy” from the Project Management window and assign each project a shared proxy or an individual one.

Click “Add” under proxy settings.

Enter the proxy details & click “Save”. We recommend buying good paid proxies for a seamless experience.

From the Staff Management menu you can grant two types of access: “Writer” for content managers and “Link Builder” for link-building staff.

While a user is assigned a certain role, he or she will have access to the related features only, and not others.

Add Staff:

Under the “Staff Management” tab select “Add Staff”. Fill in the required information and click “Save”. The staff member will be added and you will be redirected to the Staff Management page, where you can edit or delete staff.

Edit Staff:

On “Staff Management”, under the “Edit Data” column, click “Edit Staff”. Make the desired changes and click “Save”.

Delete Staff:

On “Staff Management”, under the “Actions” column, click “Delete”. Confirm your action at the confirmation prompt and the user will be removed from the “Staff List”.

How to Use Google Analytics & Webmasters

  1. Go to Add New Project/Edit Project
  2. Add Google Analytics View ID
  3. Add Google authentication code

Click Save and you are good to go.

Finding the Google Analytics View ID

TurboAnchor needs the View ID of the view that you want to analyze. A view is the lowest level of your Google Analytics account (Account > Property > View.)

To get your View ID, log into Google Analytics and:

  1. Navigate to the “Admin” tab.
  2. Select your account, property, and view.
  3. Select “View Settings” in the rightmost column.
  4. Copy the number below View ID and paste it into the Google Analytics View ID field in TurboAnchor.


How to Get Google Authentication Code

1. Go to Add New Project/Edit Project

2. Click Get Code

3. Follow the Google instructions; make sure both Analytics and Webmaster are checked



4. Copy & paste the code into the Google authentication code field



Click Save and you are good to go.

The Dashboard window displays all the major stats of the selected project on a single page, in both graphical and text form.




Website Analysis


  • Web analysis is for analyzing any website.
  • Suppose you own a website, http://xyz.com, and you want to analyze one of your competitors' sites. All you have to do is type the website's domain name and start an analysis.

Terminology

  • Whois Information: Whois information is the basic technical information about a website, such as its domain name and registration status.
  • IP Information: IP information is information about the IP address of the server that hosts the website.
  • MozRank / Moz Information: MozRank (https://moz.com/) represents a link popularity score. It reflects the importance of any given web page on the Internet. Pages earn MozRank by the number and quality of other pages that link to them. The higher the quality of the incoming links, the higher the MozRank.
  • Malware Scan Information: Malware scan info is the result of scanning the website's files and content with Google Safe Browsing & Norton. In v4.0 we added a new malware scan tool, VirusTotal, which scans in 67 different places.
  • Google Backlink: A backlink is any link received by a web node (web page, directory, website, or top level domain) from another web node. Google backlink is the count of backlinks found in Google search for a website.
  • Google Page Rank: Google PageRank (Google PR) is one of the methods Google uses to determine a page's relevance or importance. Important pages receive a higher PageRank and are more likely to appear at the top of the search results. Google PageRank (PR) is a measure from 0 - 10. Google PageRank is based on backlinks.
  • Social Network Information: Social network activities such as likes, shares, and comments on
    • Facebook
    • Pinterest
    • Reddit
    • Buffer
    • Xing
    • StumbleUpon
    • LinkedIn
  • Keyword & Meta Information: A keyword, in the context of search engine optimization, is a particular word or phrase that describes the contents of a web page. Keywords are intended to act as shortcuts that sum up an entire page. A meta description tag is a snippet of HTML code in a web page header that summarizes the content on the page. The meta description is usually placed after the title tag and before the meta keywords tag. When optimizing a web page for search engines (SEO), using meta description tags is considered a best practice.
  • Search Engine Index: Search engine indexing is the process by which a search engine collects, parses, and stores data for later use. The search engine index is where all the collected data is stored; it provides the results for search queries, and the pages stored in the index are what appear on the search engine results page. Without an index, the search engine would need considerable time and effort for every search query.
  • Alexa & SimilarWeb Information: Alexa (http://www.alexa.com) and SimilarWeb (https://www.SimilarWeb.com) provide analytical insights to benchmark, compare, and optimize businesses on the web. We have added these two popular web analytics tools' publicly available statistics for your information.




General Information




Social Network Information





Keyword and Meta Information





Alexa Information





SimilarWeb Information



Keyword Analyzer

  • Keyword Analyzer is a great tool to analyze any page's content. Just enter a URL and start the analysis. It gives you the page title and all meta tags; whether the page is blocked by robots.txt or by meta robots; the total keyword count; all h1–h6 content; and single-word, 2-phrase, 3-phrase, and 4-phrase keywords with their number of occurrences, density, and spam possibility.
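The phrase counting behind that breakdown can be sketched in a few lines of Python. This is only an illustration of the counting logic, not TurboAnchor's actual implementation; `keyword_density` is a hypothetical helper name.

```python
import re
from collections import Counter

def keyword_density(text, phrase_len=1):
    """Count n-word phrases in `text` and report (occurrences, density %),
    where density = occurrences / total phrases of that length."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrases = [" ".join(words[i:i + phrase_len])
               for i in range(len(words) - phrase_len + 1)]
    counts = Counter(phrases)
    total = max(len(phrases), 1)
    return {p: (n, round(100 * n / total, 2)) for p, n in counts.most_common()}
```

For example, `keyword_density(page_text, 2)` tabulates every two-word phrase with its count and density.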

  • Check any keyword in the Google, Bing, and Yahoo search engines to get the position of your website in the search results. Just enter your keyword and website address, and you will get your position in Google, Bing, and Yahoo for that keyword. It also gives you the other website addresses that come out on top during the search. It checks the first 100 search results.
  • When the search finishes, you can view the result and download it.
  • On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.










  • All search engines give suggestions as you type into the search field. Here we collect auto-suggested keywords for your search keyword, from Google, Bing, Yahoo, Wikipedia, and Amazon. You have the flexibility to check all of these search engines or choose your own.
  • When the search finishes, you can view the result and download it.
  • On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.






Step 1: Type website address and submit

Step 2: Download PDF report/ Share report to social media

Step 3: Download report

Step 4: Add competitor’s website and click “Compare”

All Website Health Reports are saved here in chronological order and can be accessed by clicking the “Report” button against the respective report; reports can also be searched by website name & date.

You can view all the saved comparative website health reports in this section, as shown above.

  • You can now minify any HTML, CSS & JS files. Minified HTML, CSS, and JS will make your site load faster.
  • You can paste HTML, CSS, or JS code, or upload multiple files to minify.
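To show what “minifying” does to CSS, here is a deliberately naive Python sketch that strips comments and collapses whitespace. Real minifiers, including the one in the tool, do considerably more; `minify_css` is an illustrative name only.

```python
import re

def minify_css(css):
    """Naive CSS minifier: drop comments, collapse whitespace,
    and tighten spacing around punctuation."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove /* ... */ comments
    css = re.sub(r"\s+", " ", css)                    # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # tighten around { } ; : ,
    return css.strip()
```

The same idea (fewer bytes, identical rendering) is what makes minified assets load faster.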

Simply paste or upload the HTML code file you want to minify, click “Minify the Above Code”, and leave the rest to the tool.



Simply paste or upload the CSS code file you want to minify, click “Minify the Above Code”, and leave the rest to the tool.



Simply paste or upload the JS code file you want to minify, click “Minify the Above Code”, and leave the rest to the tool.



Check the Gzip size for any website: simply type the URL and hit “Check”.
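The Gzip check essentially compares a page's raw byte size against its gzip-compressed size. A local sketch of that measurement (the real tool fetches the page over HTTP first; `gzip_savings` is a hypothetical helper):

```python
import gzip

def gzip_savings(content: bytes):
    """Return (raw size, gzipped size, % saved) for a blob of page content."""
    compressed = gzip.compress(content)
    raw, gz = len(content), len(compressed)
    return raw, gz, round(100 * (1 - gz / raw), 1)
```

A large savings percentage suggests the server should serve the page gzip-compressed.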

Check the URL canonicals for any website: simply type the URL and hit “Check”.
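A canonical check boils down to finding the `<link rel="canonical">` tag in the page's HTML. A minimal sketch with Python's standard-library parser (the class and function names are illustrative):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical
```

If `find_canonical` returns None, the page declares no canonical URL.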

Features

  • Actually crawls webpages like Google would
  • Generates a separate XML file that gets updated every time the script is executed (runnable via CRON)
  • Awesome for SEO
  • Crawls faster than online services
  • Adaptable

Usage

Usage is pretty straightforward:

  • Configure the crawler by modifying the sitemap-generator.php file
  1. Select the URL to crawl
  2. Select the file to which the sitemap will be saved
  3. Select accepted extensions ("/" is mandatory for proper functionality)
  4. Select the change frequency (always, daily, weekly, monthly, never, etc.)
  5. Choose the priority (it is all relative, so it may as well be 1)
  • Generate the sitemap
  1. Either send a GET request to this script or simply point your browser at it
  2. A sitemap will be generated and displayed
  3. Submit sitemap.xml to Google
  4. Set up a CRON job to send web requests to this script every so often; this will keep the sitemap.xml file up to date
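The bundled generator is a PHP script; for clarity, here is a Python sketch of the sitemap XML such a crawler ultimately writes, given a list of already-crawled URLs. `build_sitemap` and its defaults are assumptions for illustration only.

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls, changefreq="weekly", priority="1.0"):
    """Build sitemap.xml text (sitemaps.org schema) from crawled URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(root, "url")
        ET.SubElement(url_el, "loc").text = u
        ET.SubElement(url_el, "changefreq").text = changefreq
        ET.SubElement(url_el, "priority").text = priority
    return ET.tostring(root, encoding="unicode")
```

Writing this string to the configured file path is the "sitemap will be saved" step above.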

CLI

The script can be started as a CLI script or as a website. CLI is the preferred way to start this script.

CLI scripts are started from the command line and can be used with CRON and so on. You start it with the php program.

CLI command to create the XML file: php sitemap-generator.php

To start the program through your web server as a website, change line 22 in the script.

BASIC usage

Scan http://www.mywebsite.com/ and output the sitemap to /home/user/public_html/sitemap.xml:

php sitemap.php file=/home/user/public_html/sitemap.xml site=http://www.mywebsite.com/

Type a URL in the search bar and hit Enter; the tool will fetch the redirect status of that URL.

Selecting Google Search Console will open the Google Search Console page in a new tab.

  • Search engine indexing is the process by which a search engine collects, parses, and stores data for later use. The search engine index is where all the collected data is stored; it provides the results for search queries, and the pages stored in the index are what appear on the search engine results page. Without an index, the search engine would need considerable time and effort for every search query.
  • All you have to do is click the "New Search" button, enter domain names or upload a text/csv file, and then click the "Start Searching" button.
  • It may take some minutes depending on the number of websites you provide.
  • When the search finishes, you can view the result and download it.
  • On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.


A robots file is a simple text file that lists the search engines that are allowed or disallowed to crawl a website, and the directories of that website that can or cannot be crawled by them. It also contains the crawl delay for each search engine and the XML sitemap link of the website.

Our robots code generator is very simple. Clicking the Robot Code Generator menu gives you a form. Fill it in with suitable data: you can allow all search engines to crawl every directory of your website (full access), block your whole website, or customize crawling by selecting specific search robots. Fill in the form properly and click Save. A pop-up window will appear with the option to download your robots text file. Rename it to robots.txt and save it to your website's root directory. Please ensure the proper permissions so search engines can read robots.txt.
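The form's output can be sketched as a small generator: given per-agent disallow rules, an optional crawl delay, and a sitemap link, it assembles robots.txt text. `build_robots` is an illustrative helper, not the generator's actual code.

```python
def build_robots(rules, sitemap=None, crawl_delay=None):
    """Assemble robots.txt text from {user_agent: [disallowed paths]} rules."""
    lines = []
    for agent, disallowed in rules.items():
        lines.append(f"User-agent: {agent}")
        if not disallowed:
            lines.append("Disallow:")          # empty Disallow = full access
        for path in disallowed:
            lines.append(f"Disallow: {path}")
        if crawl_delay is not None:
            lines.append(f"Crawl-delay: {crawl_delay}")
        lines.append("")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines)
```

Saving the returned text as robots.txt in the site root is the final step described above.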

Keyword Position Tracking

Track the position of a website for a particular keyword every day.

This is similar to Keyword Position Analysis, except users do not have to search every time they need to know the position. Users (admin/others) simply set their keywords & websites for tracking once, and the system automatically records positions every day.
Please remember that one user can add only 30 keywords.
Users (admin/others) can view the tracking report anytime, with a date interval search.

Click “Add” to add the project’s keywords for tracking.

Select the keyword & date for the keyword position report and click Search.

  • Social network analysis crawls the social activity of a website. It supports bulk website analysis. All you have to do is click the "New Analysis" button, enter domain names or upload a text/csv file, and then click the "Start Analysis" button.
  • It may take some minutes depending on the number of websites and the number of social networks you select.
  • When the analysis finishes, you can view the result and download it.
  • On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.


Click “Download” for a local copy of the report.

Article Generator

Generate unique content from any text you like by simply spinning it in our spinner; for ideas on how to write the content, you can check the example sites in the sidebar.

  • Paste the content into the Spin Text box (limit 1,000 words).
  • Choose the language from the dropdown menu.
  • Once the content is spun, click Check Plagiarism to make sure your content is up to the mark (limit 1,000 words).
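Spinners commonly work on "spintax", where alternatives are written as {option1|option2}. TurboAnchor's spinner may use its own synonym database instead, but the expansion idea can be sketched like this (`spin` is a hypothetical name):

```python
import random
import re

def spin(text, rng=random):
    """Expand spintax like "{Hello|Hi} world" into one random variant."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        m = pattern.search(text)
        if not m:
            return text
        choice = rng.choice(m.group(1).split("|"))
        text = text[:m.start()] + choice + text[m.end():]
```

Each call produces one of the possible variants, which is why spun articles differ from the source and from each other.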

Google Sheets

Export all your results into your Google sheets, just sign in and save it.

  • Click Add Website
  • Enter Sheet Title & Sheet URL
  • Click Save








Security Tools

Scan any website's malware status easily and check whether the site is affected by malware. We have integrated information from the four most popular malware scanning services: Google Safe Browsing, McAfee, AVG, and Norton. For Google Safe Browsing you need your own API key; how to get the API key for the Google Safe Browsing check is described above.
All you have to do is click the "New Scan" button, enter domain names or upload a text/csv file, and then click the "Start Scanning" button.
It may take some minutes depending on the number of websites you provide.
When the scan finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.

The VirusTotal tool can scan in 67 different places and give you the scan report.

Whois Search Information

Whois information is the basic technical information about a website, such as its domain name and registration status.
All you have to do is click the "New Search" button, enter domain names or upload a text/csv file, and then click the "Start Searching" button.
It may take some minutes depending on the number of websites you provide.
When the search finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.

Years ago, when domain names expired they would drop, or become available for hand registration by anyone. Whoever was quickest to register a dropped domain became the new registrant; that is how many of the largest domainers built their enormous portfolios. Today, the story is different. Domain name registrars realized that they could auction expired domain names to the highest bidder and generate additional revenue. If no one wants a domain name at auction, it then drops and becomes available for anyone to register. Much of the time, however, domain names are successfully auctioned. A domain name that reaches expired status and is not renewed by the owner will be listed at an auction service. It does not immediately go into auction, but it is immediately listed on the partnered auction service with an auction scheduled for the near future. Such a domain name will be exclusive to that specific auction service.
Here, we have listed expired domains available for auction, with their current page rank & index status. You have to set up a cron job (see the cron job section) to auto-sync the expired domain data.

Check domain name system information such as URL, host, type, IP, class, and TTL for single or multiple domains simultaneously. Paste comma-separated domain URLs into the input box and hit “Check”.


Click the binoculars icon under Details to view the DNS information details.

Check the server details for single or multiple domains simultaneously. Paste comma-separated domain URLs into the input box and hit “Check”.


My IP Information

Check your own IP address and your location (latitude, longitude), organization, region, city, postal code, and country.

Check any domain's hosting IP address along with ISP, country, city, time zone, latitude, and longitude. It has a bulk domain search option.
All you have to do is click the "New Search" button, enter domain names or upload a text/csv file, and then click the "Start Searching" button.
It may take some minutes depending on the number of domains you provide.
When the search finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.

Most websites are hosted on shared servers, where one IP address points to many websites.
If you want to see which websites are hosted on an IP address, enter the IP address and start searching. Many websites may be hosted on a shared IP; we display a maximum of 100 domains per IP.


Check whether your websites have IPv6 compatibility, in bulk.


Check whether your website has canonical tags.


Trace an IP location by searching your domain’s IP; if you don’t know your domain’s IP, you can search for it by clicking.


Alexa Rank

Alexa (http://www.alexa.com) provides analytical insights to benchmark, compare, and optimize businesses on the web. We have added its publicly available statistics for your information. This feature shows Alexa rank info only; detailed data is shown in Alexa Data.
All you have to do is click the "New Search" button, enter domain names or upload a text/csv file, and then click the "Start Searching" button.
It may take some minutes depending on the number of websites you provide.
When the search finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.

This feature grabs Alexa's full public data set and shows it as a dashboard. All you have to do is click the "New Search" button, enter domain names or upload a text/csv file, and then click the "Start Searching" button.
It may take some minutes depending on the number of websites you provide.
When the search finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.

This feature grabs SimilarWeb's full public data set and shows it as a dashboard. All you have to do is click the "New Search" button, enter domain names or upload a text/csv file, and then click the "Start Searching" button.
It may take some minutes depending on the number of websites you provide.
When the search finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.

MozRank (https://moz.com/) represents a link popularity score. It reflects the importance of any given web page on the Internet. Pages earn MozRank by the number and quality of other pages that link to them; the higher the quality of the incoming links, the higher the MozRank.
All you have to do is click the "New Search" button, enter URLs or upload a text/csv file, and then click the "Start Searching" button.
It may take some minutes depending on the number of URLs you provide.
When the search finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.


Page Status Check is used to check any URL's status. It checks whether the URL is live or returns 404 Not Found, and it handles all 56 HTTP responses. It gives you the page download speed, total time, connect time, and name lookup time. It has a bulk checking feature.
All you have to do is click the "New Search" button, enter URLs or upload a text/csv file, and then click the "Start Searching" button.
It may take some minutes depending on the number of URLs you provide.
When the search finishes, you can view the result and download it.
On the landing page, you will see a list of all your previous results and can filter the display. You can bulk download or bulk delete them whenever you want.

Email Encoder/Decoder

This form allows you to encode your e-mail address using character entities (https://www.w3.org/TR/REC-html32), transforming your ASCII email address into its equivalent decimal entities. Simply enter your email addresses (bulk supported) and click Start Encoding. You will get an encoded email that displays as the original in the browser but appears encoded in the HTML source. This technique is widely used to protect addresses from email scrapers.
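The encoding itself is simple: each character becomes its decimal character entity. A sketch of both directions (the function names are illustrative):

```python
import re

def encode_email(address):
    """Encode every character as a decimal HTML entity (&#NN;)."""
    return "".join(f"&#{ord(c)};" for c in address)

def decode_email(encoded):
    """Reverse the encoding: turn &#NN; entities back into characters."""
    return re.sub(r"&#(\d+);", lambda m: chr(int(m.group(1))), encoded)
```

Browsers render the entities as normal text, but a scraper reading the raw HTML sees only numeric codes.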



Valid Email Checker is a tool to check whether your email addresses are valid. You can enter a list of emails directly into the text field or upload a text file containing them. After entering your list or uploading the file, hit the Start Searching button. The tool checks both the email format and the domain name. When the check completes, a pop-up window appears and you can download the valid email list as a CSV file by clicking Download.
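The format half of that check can be sketched with a regular expression; the domain half needs a DNS lookup and is omitted here. `filter_valid` and the simplified pattern are illustrative assumptions, not the tool's actual rules.

```python
import re

# Simplified format check: local part, "@", domain with a 2+ letter TLD.
EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def filter_valid(emails):
    """Keep only addresses that pass the format check."""
    return [e for e in emails if EMAIL_RE.match(e)]
```

A production checker would additionally resolve each domain's MX records before accepting an address.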



Just like the Valid Email Checker, this tool searches for unique email addresses in a list of emails. Enter your list of emails directly or upload a text file containing them, then hit the Start Searching button. The tool searches for unique emails and gives you the option to download a CSV file containing all the unique emails from your list.



URL Encoder/Decoder is a tool to encode URLs into encoded strings and decode them back to plain text. Just enter your URL list directly into the text field or upload a text file of URLs, then hit Start Searching. When it finishes, a pop-up window appears with the option to download your encoded or decoded URLs as a CSV file.
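Percent-encoding is what this tool applies; Python's urllib.parse shows the idea. The safe-character set kept here (scheme and query delimiters) is an assumption for illustration.

```python
from urllib.parse import quote, unquote

def encode_urls(urls):
    """Percent-encode each URL, keeping scheme/path/query delimiters intact."""
    return [quote(u, safe=":/?&=") for u in urls]

def decode_urls(urls):
    """Reverse percent-encoding back to plain text."""
    return [unquote(u) for u in urls]
```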



Base64 is a way to encode binary data into an ASCII character set known to pretty much every computer system, in order to transmit the data without loss or modification of the contents. For example, mail systems cannot deal with binary data because they expect ASCII (textual) data.
Simply paste your content and select Start Encoding or Start Decoding as needed.
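The encode/decode pair maps directly onto the standard Base64 routines; a minimal sketch:

```python
import base64

def b64_encode(data: bytes) -> str:
    """Binary bytes -> ASCII-safe Base64 text."""
    return base64.b64encode(data).decode("ascii")

def b64_decode(text: str) -> bytes:
    """Base64 text -> original binary bytes."""
    return base64.b64decode(text)
```

Because the output uses only ASCII characters, it survives transports (like older mail systems) that mangle raw binary.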



Check your usage details and remaining credits.

You can select a package and click "Subscribe" to make payments; your payment history is displayed below for reference.