Hosting Frequently Asked Questions
- What are the SwitchWorks DNS server names for ADSL Users?
- How do I keep crawlers and spam bots away from my web page?
- What is Virtual Web Hosting?
- Will I be able to change and update my web pages on a regular basis?
- How can I change my domain name?
- Can I use an SSL (Secure Sockets Layer) Certificate to secure my website?
- How do I administer email accounts?
- How many domain names can I have on one package?
- Why am I not able to see my web site even after 10-12 hours?
- How do I transfer files to my web site?
- Can my site be set up so that www.mydomain.com and mydomain.com both point to the same web site?
- Do you host international domains?
- How do I make changes to my personal webspace?
When you install and set up Internet software on your computer (such as web browsers, e-mail applications, or FTP clients), you will likely need to enter some server information before the program will function properly. Listed below are the most commonly used SwitchWorks servers, along with the correct server names and/or IP addresses to specify.
|Server Type|Server|Name or IP Address|
|---|---|---|
|Domain Name Servers|Primary DNS|184.108.40.206|
|Misc. Servers|Personal Website|home.mycybernet.net|
A crawler, robot (or “bot”), spider or wanderer is a computer program that searches the Internet and records information about web pages. For example, a “spam bot” will search the pages of web sites and record any e-mail addresses it finds on them.
Although you can’t keep away all robots, you can give instructions to those that follow the “Robots Exclusion Protocol”. This method requires that you create a plain text file called robots.txt and place it in your site’s root directory. For example, if your web site URL is http://www.mycompanydomain.com/, then the file must be accessible at http://www.mycompanydomain.com/robots.txt for the restrictions to take effect. It is important to note that the file must be in your root directory (i.e. ~/public_html/) and no other.
The contents of robots.txt consist mainly of two commands: “User-agent” and “Disallow”.
The “User-agent” command allows you to set restrictions on a robot with a particular name (or signature). You can set this to the asterisk (*) to specify that restrictions apply to all robots that aren’t identified elsewhere in the file.
The “Disallow” command allows you to deny access to certain directories in your web site.
Example of robots.txt
Say your URL is http://www.mysite.com. If you put a robots.txt file into your ~/public_html/ directory containing the following:
Figure 1: sample robots.txt file
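The original figure is not reproduced here; based on the behavior described in the next paragraph, a robots.txt file matching it would look like this (directory names as given in the text):

```
# Deny the WebCrawler robot access to the entire site
User-agent: WebCrawler
Disallow: /

# All other robots: block only these two directories
User-agent: *
Disallow: /neat_stuff/
Disallow: /my_pvt_stuff/
```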
In the example, all robots would have free access to the web site except for files contained in the /neat_stuff/ and /my_pvt_stuff/ directories, but the WebCrawler robot is denied all access to the site.
NOTE: Since not all robot authors acknowledge the “Robots Exclusion Protocol”, it is not possible to stop all robots using this method. However, most search engine robots do follow this protocol. Please refer to documentation on their sites for more information on this.
Using the “ROBOTS” META tag
Using the META tag method, you can specify restrictions in your web pages individually. Unfortunately, fewer robots recognize this method than the robots.txt method. The META tag has two parameters in its content: INDEX or NOINDEX, and FOLLOW or NOFOLLOW. See the following examples:
Figure 2: sample of “ROBOTS” META tag
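The figure content itself is missing; a META tag matching the behavior described below would be:

```html
<meta name="ROBOTS" content="NOINDEX, NOFOLLOW">
```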
In this example, the robot is instructed to neither index the current page, nor to follow links in the page for indexing.
Figure 3: sample of “ROBOTS” META tag
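Again, the figure content is missing; a META tag matching the behavior described below would be:

```html
<meta name="ROBOTS" content="NOINDEX, FOLLOW">
```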
In this example, the robot is instructed not to index the current page, but is allowed to follow links on the page for indexing. The structure of the META tag should be clear by now.
NOTE: All META tags should be specified within the `<head>` block of your HTML document.
- For best results, you should use the robots.txt method if possible. Simply FTP the file into your ~/public_html/ directory.
- The META tag method can be used if you cannot use the robots.txt method.
- Both methods can be used together.
- If privacy of your web pages is essential, password-protect your pages using an .htaccess file (shell access is needed for this). Robots can’t enter a password-protected page without the password.
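As a sketch of the .htaccess approach mentioned above (the file paths, realm name, and username below are illustrative; the actual locations depend on your account layout):

```
# ~/public_html/private/.htaccess -- protects this directory
AuthType Basic
AuthName "Private Area"
# Keep the password file outside the web root where possible
AuthUserFile /home/username/.htpasswd
Require valid-user
```

The password file itself is typically created over shell access with a command such as `htpasswd -c /home/username/.htpasswd someuser`, which prompts for a password.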
We specialize in virtual web hosting, which means you can find a home for your website on our high performance web servers and establish your presence on the Internet with your own unique domain name. This is a cost effective alternative to hosting your own website internally. Our shared hosting environment gives you the benefits of high performance servers, high bandwidth connectivity that can seamlessly grow with your needs, pre-installed software, guaranteed reliability, and round the clock technical support, all at a fraction of the cost of doing it yourself.
Yes, you will be able to update your web page at any time using FTP. There are no additional charges for updating your web site, so you can update it as often as you want. If you need more information on how to do this, contact SwitchWorks technical support for more information.
You can request to change the domain name on your shared hosting plan as many times as you want. However, there is a one-time processing fee every time you make a request. Before you request a domain name change, make sure your new domain name is registered and is pointing to the same name servers as your existing domain.
You can request a domain name change by sending an email to our Sales Department with your current domain name and password as verification and authorizing the processing fee.
Yes, you can have an SSL (Secure Sockets Layer) certificate installed at the server level for your domain to use. You can purchase a Shared SSL certificate, or, if you want an SSL certificate exclusively for your domain name, you can purchase one through us. We offer GeoTrust QuickSSL Premium Certificates.
For more information, please contact our Sales Department.
You can administer your email accounts from your website's Control Panel (www.yourdomainname.com/admin).
SwitchWorks provides single domain hosting only. However, you can point multiple domains to the main site.
If you have modified the name servers for your domain or signed up for a new domain, it will take 24-48 hours before you can see your site on our servers. The reason for this delay is that when name servers are modified or a new domain is registered, this information has to propagate across the Internet before the changes become visible.
Files can be transferred to the web server using an FTP program (File Transfer Protocol).
Microsoft FrontPage users can ‘publish’ their sites to our server and should not use FTP.
By default www.domain.com and domain.com will point to the same place.
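In DNS terms, this is usually done with an extra record for the www host. A BIND-style zone fragment (the domain and IP address are placeholders) might look like:

```
; illustrative zone fragment for domain.com
domain.com.      IN  A      203.0.113.10
www.domain.com.  IN  CNAME  domain.com.
```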
We host all domains authorized by ICANN including all TLDs and country codes.
You will need FTP software (an FTP client) to access the server.
Hostname: ftp.yourdomainname (e.g., ftp.sampledomain.com)
Username: (Username that was given to you for your hosting account)
Password: (Password that was given to you for your hosting account)