Private Domain Registration


Did you know that when you register a domain, your name, address, email address, and phone number are immediately made available to anyone who wants to see them? That’s right. Your personal information is exposed 24 hours a day, every day, to anyone, anywhere.

You have the power to change this. With a private (“unlisted”) registration through Domains By Proxy (our affiliate company), registering a domain name doesn’t mean sacrificing your privacy.

How It Works

  • Your new or existing domain is registered in the name of Domains By Proxy, so their information is made public, not yours.
  • You retain the full benefits of domain registration. You can cancel, sell, renew, or transfer your domain; set up name servers for your domain; and resolve disputes involving your domain.
  • The patented registration and email handling systems let you manage and control all postal mail and email addressed to the domain you have registered, as well as the domain’s contact information.
  • Your domain registration is safe and insured against loss.
  • But don’t even think about using a private registration to transmit spam, violate the law, or engage in morally objectionable activities.

A Private “Unlisted” Registration protects you from:

  • Domain-related spam
  • Harassers, stalkers, and data miners
  • Disclosure of a “moonlighting” home or side business
  • Privacy intrusions

Adding Private Registration to Your Domain Names

Per our agreement with the Internet Corporation for Assigned Names and Numbers (ICANN), we must add valid contact information to the Whois directory for each domain name you register. By purchasing private registration from our affiliate company, Domains By Proxy® (DBP), you can hide your personal contact information and display proxy information instead.

Private registration is available for many domain name extensions. However, some registries do not allow privacy. If the registry does not allow privacy, the Privacy tab is not available in the domain name’s Upgrade section of the Domain Manager.

To Add Privacy to Your Domain Names

  1. Log in to your Account Manager.
  2. In the My Products section, click Domain Manager.
  3. Select the domain name(s) you want to add privacy to.
  4. Click Upgrade.
  5. From the Registration tab, select Privacy.
  6. Click Add, and then select one of the following:
     • Create a new account – If you do not have a DBP account, DBP creates a new one for you. Then, DBP sends you an email message with your user name, password, and a link to the DBP website. You should save the email message for your records.
     • Use an existing account – If you already have a DBP account, enter your login information in the User Name and Password fields.
  7. Click Checkout, and complete the checkout process.

It can take 24 to 48 hours for the DBP information to display in the Whois database.

For more information about using your DBP account, see What can I do in my Domains By Proxy account?


Private Domain Registration Deal:

Get FREE Private Registration ($7.95/yr value) when you register or transfer five or more domains, NO QUANTITY LIMIT! Protect yourself from spam, fraud, stalkers and worse by keeping your name, address, email and phone number private.

Private Registration MUST be added to your cart before checkout in order to qualify for this offer.

This offer is valid for .COM, .INFO, .NET, .ME, .ORG, .BIZ, .NAME, .MOBI, .MX, .COM.ES, .WS, .NOM.ES, .ES, .ORG.ES, .NL, .COM.MX, .BZ, .COM.BZ, .CC, .NET.BZ, and .TV.

Google and Duplicate Content Guidelines

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin. Examples of non-malicious duplicate content could include:

  • Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
  • Store items shown or linked via multiple distinct URLs
  • Printer-only versions of web pages

If your site contains multiple pages with largely identical content, there are a number of ways you can indicate your preferred URL to Google. (This is called “canonicalization.”)
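As a sketch of canonicalization (the URLs here are hypothetical examples, not from the original article), a duplicate page such as a printer-friendly version can point search engines at the preferred URL with a rel="canonical" link element in its head:

```html
<!-- On the duplicate (e.g. printer-friendly) page, tell search engines
     which URL you prefer to have indexed. URLs are made up for illustration. -->
<head>
  <title>Example Article (Printer Version)</title>
  <link rel="canonical" href="http://www.example.com/article">
</head>
```

Crawlers that honor this hint consolidate indexing signals onto the canonical URL instead of treating the two pages as competing duplicates.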

However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.

Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a “regular” and “printer” version of each article, and neither of these is blocked with a noindex meta tag, we’ll choose one of them to list. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

There are some steps you can take to proactively address duplicate content issues, and ensure that visitors see the content you want them to.

Use 301s: If you’ve restructured your site, use 301 redirects (“RedirectPermanent”) to send users, Googlebot, and other spiders to the new URLs. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)
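As a rough sketch (the paths and domain below are invented for illustration), 301 redirects in an Apache .htaccess file might look like this:

```apache
# Permanently (301) redirect a moved page to its new URL,
# so users, Googlebot, and other spiders all land in one place.
Redirect permanent /old-page.html http://www.example.com/new-page/

# Or redirect an entire retired directory to its replacement.
RedirectMatch permanent ^/archive/(.*)$ http://www.example.com/articles/$1
```

The "permanent" keyword is what issues the 301 status code; a plain Redirect defaults to a temporary 302, which does not consolidate ranking signals the same way.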

Be consistent: Try to keep your internal linking consistent. For example, don’t link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.

Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We’re more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.

Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
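As an illustrative sketch of the noindex suggestion above (the page itself is hypothetical), a syndication partner could place this tag in the head of their copy of your article so search engines skip that version:

```html
<!-- On the syndicated copy: allow readers to see the page,
     but ask search engines not to index this duplicate version. -->
<meta name="robots" content="noindex">
```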

Use Webmaster Tools to tell us how you prefer your site to be indexed: You can tell Google your preferred domain (for example, http://www.example.com or http://example.com).

Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. In addition, you can use the Parameter Handling tool to specify how you would like Google to treat URL parameters.

Avoid publishing stubs: Users don’t like seeing “empty” pages, so avoid placeholders where possible. For example, don’t publish pages for which you don’t yet have real content. If you do create placeholder pages, use the noindex meta tag to block these pages from being indexed.

Understand your content management system: Make sure you’re familiar with how content is displayed on your web site. Blogs, forums, and related systems often show the same content in multiple formats. For example, a blog entry may appear on the home page of a blog, in an archive page, and in a page of other entries with the same label.

Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.

Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can’t crawl pages with duplicate content, they can’t automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel=”canonical” link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools.

Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results. If your site suffers from duplicate content issues, and you don’t follow the advice listed above, we do a good job of choosing a version of the content to show in our search results.

However, if our review indicated that you engaged in deceptive practices and your site has been removed from our search results, review your site carefully. If your site has been removed from our search results, review our webmaster guidelines for more information. Once you’ve made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.

If you find that another site is duplicating your content by scraping (misappropriating and republishing) it, it’s unlikely that this will negatively impact your site’s ranking in Google search results pages. If you do spot a case that’s particularly frustrating, you are welcome to file a DMCA request to claim ownership of the content and request removal of the other site from Google’s index.

Sourced from: Google Webmaster Guidelines

Yahoo Website Optimizer

After Google, Yahoo! is one of the most important search engines for driving traffic to your website. To help webmasters and developers get clear information about optimizing for Yahoo!, we would like to share what Yahoo! wants, and does not want, in its search results:

Pages Yahoo! Wants Included in Its Index:

  • Original and unique content of genuine value.
  • Pages designed primarily for humans, with search engine considerations a secondary concern.
  • Hyperlinks intended to help people find interesting, related content, when applicable.
  • Metadata (including title and description) that accurately describes the contents of a web page.
  • Good web design in general.

Unfortunately, not all web pages contain information that is valuable to a user. Some pages are created deliberately to trick the search engine into offering inappropriate, redundant or poor-quality search results. This is often called “spam.” Yahoo! does not want these pages in the index.

What Yahoo! Considers Unwanted:

Some, but not all, examples of the types of content that Yahoo! does not want include:

  • Pages that harm the accuracy, diversity or relevance of search results.
  • Pages dedicated to redirecting the user to another page (doorway pages).
  • Multiple sites or pages offering substantially the same content.
  • Sites with numerous, unnecessary virtual hostnames.
  • Pages produced in great quantities, which have been automatically generated or which are of little value (cookie-cutter pages).
  • Pages using methods to artificially inflate search engine ranking.
  • The use of text or links that are hidden from the user.
  • Pages that give the search engine different content than what the end user sees (cloaking).
  • Sites excessively cross-linked with other sites to inflate a site’s apparent popularity (link schemes).
  • Pages built primarily for the search engines, or pages with excessive or off-topic keywords.
  • Misuse of competitor names.
  • Multiple sites offering the same content.
  • Sites that use excessive pop-ups that interfere with user navigation.
  • Pages that seem deceptive, fraudulent, or provide a poor user experience.

Yahoo! Search Content Quality Guidelines are designed to ensure that poor-quality pages do not degrade the user experience in any way. As with other Yahoo! guidelines, Yahoo! reserves the right, at its sole discretion, to take any and all action it deems appropriate to ensure the quality of its search index.

Sourced from: http://help.yahoo.com/l/us/yahoo/search/basics/basics-18.html

Bing Not Indexing My Site Properly

Many people complain that Bing is not indexing their site properly. Below is the solution from Brett Yount, Program Manager, Bing Webmaster Center:

If your site is not in the index, please do the following:
1. Verify in our tools that your site is not blocked.
2. Run a site: query to verify there are no pages in the index.
3. Copy the URL of the site: query and post it on this thread.

I will work with you to at least get your home page indexed. Deeper indexing will require good content and backlinks as described in the FAQ.

Learn more about driving website traffic to your site here.

Smart Link Building For Bing Webmaster

How can you be successful at link building from a search engine’s perspective? Smart link building means driving more website traffic to your site!
We would like to share an article on this topic from Rick DeJarnette of the Bing Webmaster Center. It was written specifically for the Bing search engine, but the information is also helpful for link building with other search engines.

What is the point of building links?

Your website is your self-representation on the Web. It’s a major asset to your business, often simultaneously serving as your online business card, an introductory company brochure, detailed sales literature, supporting documentation, and a distribution point for your products and/or services. It’s also your place to demonstrate your expertise in your specialized field of interest. If your website offers something of worth, valuable to web users interested in that topic, then it behooves you to let the world know about it. Consider the effort your contribution to the betterment of humanity (or at least a chance to make a few conversions!).

Link building is a very important form of self-promotion on the Web. You contact webmasters of other, related websites and let them know your site exists. If the value that you have worked so hard to instill in your site is evident to them, they will assist their own customers by linking back to your site. That, my friend, is the essence of link building.

Think of link building as your chance to build your reputation on the Web. As your website is likely one of your business’ most valuable assets, consider link building to be a primary business-building exercise. Just don’t make the mistake of believing it will result in instant gratification. Successful link building efforts require a long-term commitment, not an overnight or turnkey solution. You need to continually invest in link building efforts with creativity and time. Good things come to those who wait (and work smartly!).

Bing’s policy on link building

Bing’s position on link building is straightforward – we are less concerned about the link building techniques used than we are about the intentions behind the effort. That said, techniques used are often quite revealing of intent. Allow me to explain.

Bing (as well as other search engines) places an extremely high priority on helping searchers find relevant and useful content through search. This is why we regularly say that search engine optimization (SEO) techniques oriented toward helping users are ultimately more effective than doing SEO specifically for search engine crawlers (aka bots).

The webmasters who create end user value within their websites, based on the needs of people, are the ones who will see their page rank improve. So where does that value come from? Content. Good, original, text-based content.

How do I get valuable inbound links?

Make no mistake: getting legitimate and highly valuable, inbound links is not a couch-potato task. It’s hard work. If it were easy to do, everyone would do it and everyone would have the same results – mediocrity. But this is not to say that it is impossibly hard or that successful results are unattainable. Persistence and diligence are extremely important, but so is having something of value, content-wise, to earn those inbound links to your site.

We’ve said it before, and you’ll hear it said again: content is king. Providing high-quality content on your pages is the single most important thing you can do to attract inbound links. If your content is unique and useful to people, your site will naturally attract visitors and, as a result, automatically get good links to your site. By focusing on great content, over time, your site will naturally acquire those coveted inbound links.

But are all inbound links created equal? Not at all. Your goal should be to focus on getting inbound links from relevant, high-quality sites that are authorities in your field.

Site relevance

Relevance is important to end users. If you run a site dedicated to model trains, getting an inbound link from an illicit pharmaceutical goods site is orthogonal to the interests of your customers. Unless the outbound linking page from such a site makes a relevant case for linking to you, this type of unrelated link is of minimal value (and if the intention is determined to be manipulative, may even lead to penalties against your site). Why? Because so many sites today are set up solely to serve as link exchanges, where they have no specific theme to their site (other than seemingly random – and usually paid for – outbound links). As these sites do nothing to advance the cause of the web user looking to find useful information, search engines regard them as junk for end users, and thus as junk links to their linked-to sites.

You see, search engines know everything about the sites linking to your site. We crawl them just as we crawl your site. We see the content they possess and the content you possess. If there is a clear disconnect, the value of that inbound link is significantly diminished, if not completely disregarded.

Authority sites

So what links are valuable? That’s pretty easy, isn’t it? If relevance is important, the most highly regarded, relevant sites are best of all. Sites that possess great content, that have a history in their space, that have earned tons of relevant, inbound links – basically, the sites who are authorities in their field – are considered authoritative sites. And as authorities, the outbound links they choose to make carry that much more value (you don’t get to be an authority in your field by randomly linking out to irrelevant, junk sites). Good SEO practices, a steady history, great content, and other, authoritative inbound links beget authority status. The more relevant, authoritative inbound links you earn for your website, the more of an authority your site becomes in the eyes of search engines. These are the natural results of solid content and smart link building.

Going unnatural

So what does it mean to go unnatural? It means you’re trying to fake out the search engines, attempting to earn a higher ranking than the quality of your site’s content would naturally dictate by manipulating search engine ranking algorithms. This chicanery can range from relatively benign but useless efforts to overly aggressive promotion to outright fraud. And because the major search engine bots continually crawl the entire Web, we see what is being done: the relationships between linked sites, the changes to links over time, which sites link to one another, and much more. We account for these cunning behaviors in the indexing values applied to those pages.

Examples of potentially conspiratorial hocus-pocus that might be perceived as unnatural and warrant a closer review by search engine staff include but are not limited to:

  • The number of inbound links suddenly increases by orders of magnitude in a short period of time
  • Many inbound links coming from irrelevant blog comments and/or from unrelated sites
  • Using hidden links in your pages
  • Receiving inbound links from paid link farms, link exchanges, or known “bad neighborhoods” on the Web
  • Linking out to known web spam sites

When probable manipulation is detected, a spam rank factor is applied to a site, depending upon the type and severity of the infraction. If the spam rating is high, a site can be penalized with a lowered rank. If the violations are egregious, a site can be temporarily or even permanently purged from the index.

Using the Webmaster Center Backlinks tool

Are you curious to see who is linking to your site and how authoritative Bing considers each site to be? Check out the Bing Webmaster Center tools, specifically the Backlinks tool. (If you haven’t yet registered your websites with Webmaster Center, go to About the Bing Webmaster Center tools to learn more.) Once logged in, click the site you wish to review from the Site List page (a webmaster can register multiple sites on one account), then click the Backlinks tool tab. The Page score field associated with each linked page indicates a relative value for that page.

So what can I do to get good, legitimate inbound links?

OK, so you have great content. You built it, and now they will come, right? Well, if you have the patience of Vladimir and Estragon, sure. But sometimes you want to nudge the world a little bit. You want to speed up that process, all the while remaining legitimate in your efforts. You want to actively participate in link building!

I described link building earlier as hard work. But perhaps smart work is a better description. Check out a few of these smart ideas and determine how they apply to your site, your customers, and your industry’s niche. Note that all of these ideas are predicated on the assumption that you’ve already created useful, original, expert content that users will want to read and webmasters of relevant sites will want to link to. That done, let’s spread the news! Here’s how:

  • Develop your site as a business brand and be consistent about that branding in your content
  • Identify relevant industry experts, product reviewers, bloggers, and media people and let them know about your site and its content
  • Write and publish concise, informative press releases online as developments warrant
  • Publish expert articles to online article directories
  • Participate in relevant blogs and forums and refer back to your site’s content when applicable (Note that some blogs and forums add the rel=”nofollow” attribute to links created in user-generated content (UGC). While creating links to your content in these locations won’t automatically create backlinks for search engines, readers who click through and like what they find may create outbound links to your site, and those are good.)
  • Use social media sites such as Twitter, Facebook, and LinkedIn to connect to industry influencers and establish contacts, some of whom may link back to you (be sure you have your profiles set up with links back to your website first)
  • Create an online newsletter on your site with e-mail subscription notifications
  • Launch a blog or interactive user forum on your site*
  • Join and participate in relevant industry associations and especially in their online forums
  • Ultimately, strive to become a trusted expert voice for your industry and let people know that your website contains your published wit and wisdom

For more link building ideas, check out these additional link building tips.
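As a small sketch of the rel=”nofollow” attribute mentioned above (the URL and link text are hypothetical), a forum or blog comment system typically emits user-supplied links like this:

```html
<!-- Search engines are told not to pass ranking credit through this link,
     but human readers can still click through to the site. -->
<a href="http://www.example.com/" rel="nofollow">My model train guides</a>
```

This is why such links don’t create backlinks for search engines directly, yet can still bring visitors who may later link to you normally.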

* Note that if you do go this route, please keep site security issues in mind. You’ll want to keep a close eye on all UGC, being watchful for possible code injection and malformed links (filter for it if you can). Consider disallowing UGC from unregistered users. Be sure to keep up with web server and application software updates, use applicable security software, require strong passwords, etc. For more information on securing your web server, check out our recent series of security blog articles, “The Merciless Malignancy of Malware,” especially Part 3 and Part 4.

When you do develop these new ideas, be consistent in your work. Nothing can kill momentum like over-commitment early on and under-delivery later on. Assess what you can realistically do on an on-going basis, commit to it as part of building your business, and do it consistently and with high quality. You want to develop both an online library of worthwhile content and a reputation for the regular delivery of new material. That is what draws initial visitors and keeps them coming back.

If you have any questions, comments, or suggestions, feel free to post them in our SEM forum. See you again soon…

by Rick DeJarnette, Bing Webmaster Center

How To Create Good Meta Descriptions?

How do you create good meta descriptions, and why do they matter? Google has just released a detailed guide on this topic. We would like to share this useful information with our site visitors. We hope it will help webmasters make their sites more Google-friendly, driving more website traffic and more business.

The HTML suggestions page in Webmaster Tools lists pages where Google has detected missing or problematic meta descriptions. (To see this page, click Diagnostics in the left-hand menu of the site Dashboard. Then click HTML suggestions.)
Differentiate the descriptions for different pages. Using identical or similar descriptions on every page of a site isn’t very helpful when individual pages appear in the web results. In these cases we’re less likely to display the boilerplate text. Wherever possible, create descriptions that accurately describe the specific page. Use site-level descriptions on the main home page or other aggregation pages, and use page-level descriptions everywhere else. If you don’t have time to create a description for every single page, try to prioritize your content: at the very least, create a description for the critical URLs like your home page and popular pages.

Include clearly tagged facts in the description. The meta description doesn’t just have to be in sentence format; it’s also a great place to include structured data about the page. For example, news or blog postings can list the author, date of publication, or byline information. This can give potential visitors very relevant information that might not be displayed in the snippet otherwise. Similarly, product pages might have the key bits of information – price, age, manufacturer – scattered throughout a page. A good meta description can bring all this data together. For example, the following meta description provides detailed information about a book.
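The original guide shows its book example as an image; a meta description of that kind might look like the following sketch (the author, illustrator, price, and page count are invented for illustration):

```html
<!-- Clearly tagged, separated facts about the page, rather than prose. -->
<meta name="description" content="Author: A. Writer, Illustrator: P. Painter, Category: Books, Price: $17.99, Length: 784 pages">
```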

In this example, information is clearly tagged and separated.

Programmatically generate descriptions. For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions can be impossible. In the latter case, however, programmatic generation of descriptions is appropriate and encouraged. Good descriptions are human-readable and diverse, as we discussed in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation. Keep in mind that meta descriptions comprised of long strings of keywords don’t give users a clear idea of the page’s content, and are less likely to be displayed in place of a regular snippet.

Use quality descriptions. Finally, make sure your descriptions are truly descriptive. Because meta descriptions aren’t displayed in the pages the user sees, it’s easy to let this content slide. But high-quality descriptions can be displayed in Google’s search results, and can go a long way toward improving the quality and quantity of your search traffic.

Good luck to your online business.

Learn “How to drive more website traffic” here.

Google Design and Content Guidelines for Webmasters

Google is the king of search engines, so Google’s design and content guidelines are very important for every webmaster. They directly affect a website’s ranking and traffic in Google search results. Below are the guidelines from Google:

– Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

– Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.

– Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

– Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

– Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images. If you must use images for textual content, consider using the “ALT” attribute to include a few words of descriptive text.

– Make sure that your <title> elements and ALT attributes are descriptive and accurate.

– Check for broken links and correct HTML.

– If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

– Keep the links on a given page to a reasonable number (fewer than 100).
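Several of the guidelines above can be sketched in a single page fragment; the file names, link text, and wording here are hypothetical examples, not from Google’s guide:

```html
<head>
  <!-- A descriptive, accurate title element -->
  <title>Handmade Oak Furniture – Care and Repair Guides</title>
</head>
<body>
  <!-- Prefer plain text links so the crawler can follow and read them -->
  <a href="care-guide.html">Oak furniture care guide</a>

  <!-- When an image must carry meaning, describe it with the alt attribute,
       since the crawler doesn't recognize text contained in images -->
  <img src="oak-table.jpg" alt="Restored antique oak dining table">
</body>
```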

Domain Hosting Service

The Web Hosting product family provides hosting solutions for users on all levels; from basic shared hosting plans to high-performance dedicated servers that allow users to take complete control of their hosting needs. The following overview will acquaint you with the available offerings in the Web Hosting product family.

The Web Hosting product family includes the following offerings:

  • Web hosting
  • Dedicated servers
  • Virtual dedicated servers

Product Overview

Our robust and feature-rich hosting plans offer generous disk space and data transfer quotas, free email and email forwarding, MySQL and Access database support, Web site statistics, and around-the-clock upload availability and technical support. All Web sites are hosted in our secure, monitored world-class data center.

Hosting plan overview:

  • Economy Plan – 5GB disk space, 250GB monthly data transfer, 500 email accounts, 50 email forwarding accounts, 10 MySQL databases, free value applications.
  • Deluxe Plan – 100GB disk space, 1000GB monthly data transfer, 1000 email accounts, unlimited email forwarding accounts, 25 MySQL databases, SQL Server 2000 (Windows-based plans), free value applications.
  • Premium Plan – 200GB disk space, 2000GB monthly data transfer, 2000 email accounts, unlimited email forwarding accounts, 50 MySQL databases, unlimited Access databases, SQL Server 2000 (Windows-based plans), free value applications.

All plans are available with choice of either Windows or Linux operating system.

Customer Benefits

  • Free setup.
  • Peace of mind knowing site is monitored, protected and secured, 24 x 7, with daily backups.
  • Web site is hosted in our world-class data centers, with best-of-breed routers, firewalls, and servers.
  • Generous disk space and data transfer limits, with 24 x 7 FTP access.
  • 24 x 7 email, telephone, and Web-based tech support.
  • Web site statistics.
  • Free access to value applications.

How it Works

Hosting accounts are available to suit the needs of any business or individual.

Dedicated Servers

Product Overview

A dedicated server grants its user exclusive rights to the server’s bandwidth, memory and storage space, and complete control of server usage and software installation through admin (root) access to the server. This means that the server can be used for a virtually unlimited variety of purposes, including gaming, database management, and hosting and management of multiple traffic-intensive Web sites. Dedicated servers are particularly useful for companies and individuals that run very-high-traffic Web sites or applications.

Dedicated servers are available with the following operating systems:

  • Linux: Red Hat Fedora Core 6 / Red Hat Enterprise 4 / CentOS 4
  • Microsoft: Windows Server 2003 Web Edition / Standard Edition

Customer Benefits

  • Free rapid setup.
  • Complete control over server space.
  • Install and run virtually anything on server through admin (root) access.
  • Manage multiple Web sites on a single server.
  • 24 x 7 FTP access.
  • 24 x 7 email, telephone, and web-based technical support.
  • 24 x 7 monitoring.
  • 24 x 7 physical security.
  • Best-of-breed routers and servers.

How it Works

When users purchase a dedicated server, they actually lease a server box, which is configured and set up according to their preferences but remains at our data center. A dedicated server account provides the user with a dedicated IP address and full control of server usage and software installation through admin (root) access to the server.

Users may customize their servers upon purchase.

Virtual Dedicated Servers

Product Overview

Virtual dedicated servers offer many of the capabilities and features of dedicated servers, including admin (root) access and dedicated IP addresses, but at a much lower price. A virtual dedicated server is the perfect, affordable solution for users who wish to operate multiple Web sites, traffic-intensive Web sites, or gaming servers, or who simply want unlimited control of their server space through root (admin) access. Users share a server, but because each virtual dedicated server is effectively isolated from the other accounts, each user retains full control over his/her server space. Our virtual dedicated servers are available with a choice of Windows or Linux operating systems.

Customer Benefits

  • Complete control of server space.
  • Generous bandwidth and disk space.
  • Free rapid setup.
  • Similar capabilities to those of dedicated servers, such as managing multiple Web sites on a single server, but at a much lower price.
  • 24 x 7 FTP access.
  • 24 x 7 email, telephone, and web-based technical support.
  • 24 x 7 monitoring.
  • 24 x 7 physical security.
  • Best-of-breed routers and servers.

How it Works

Servers are partitioned in such a way that each virtual dedicated server operates almost entirely independently of the other accounts that share the same server space. With a virtual dedicated server, users will enjoy consistent, high performance even when usage peaks on the main server.

Users may customize their servers upon purchase.

Domain Name Registration Services

The Domain Name Registration product family addresses what will likely become the core of your business as the products enable your customers to take the crucial first step in establishing a Web presence: Finding and securing a domain name for their Web sites. The following overview will acquaint you with the available offerings in the Domain Name Registration product family.

The Domain Name Registration product family includes the following offerings:

  • Domain name registrations and transfers.
  • Private domain registrations (via Domains By Proxy®).
  • Domain backordering (DomainAlert® Pro).

Domain Name Registrations and Transfers

Product Overview

Domain name registration is the process of reserving the rights to an Internet domain for a predetermined amount of time. Domains are generally registered for one to ten years. A domain transfer moves the registration of a domain name from one registrar to another. The following domain extensions (i.e., top-level domains (TLDs)) are currently available for registration and/or transfer:
.biz .com .info .jobs .mobi .name .net .org .tv .ws

Country-Code TLDs

.am .at .be .cc .cn/.com.cn/.net.cn/.org.cn .co.nz/.net.nz/.org.nz .co.uk/.me.uk/.org.uk .tw/.com.tw/.idv.tw/.org.tw .de .eu .fm .gs .jp .ms .nu .nw .tc .tk .us .vg

Customer Benefits

  • The first step in establishing a Web presence.
  • Large variety of available TLDs.

How it Works

The domain name registration process begins with a search for available names. Once a domain has been registered or transferred, an intuitive interface allows the registrant to perform all domain-related maintenance functions.
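An availability search typically boils down to querying a Whois server and checking the reply for a "not found" marker. The following is a minimal sketch of that idea; the marker strings and sample replies are illustrative assumptions, not any registry's actual response format.

```python
# Phrases some Whois servers return when a domain is unregistered.
# These markers vary by registry; the list here is an illustrative assumption.
AVAILABILITY_MARKERS = (
    "no match for",
    "not found",
    "no data found",
)

def looks_available(whois_reply: str) -> bool:
    """Return True if the Whois reply suggests the domain is unregistered."""
    reply = whois_reply.lower()
    return any(marker in reply for marker in AVAILABILITY_MARKERS)

# Abbreviated, hypothetical replies:
print(looks_available('No match for "EXAMPLE-UNUSED-NAME.COM".'))   # True
print(looks_available("Domain Name: EXAMPLE.COM\nRegistrar: ..."))  # False
```

In practice, the registration system would issue the Whois query over the network and then hand available names off to the registration workflow; the parsing step above is the part that decides what "available" means.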

Private Domain Registrations

Product Overview

Private registration allows a domain name registrant to keep his/her personal information out of the Whois database, thus shielding the registrant’s identity from the public eye. Private domain registrations are managed through Domains By Proxy®, which is also the entity whose contact information appears in the Whois database for the privately registered domain. Your customer retains all rights to the domain registration; Domains By Proxy simply serves as the intermediary (or “proxy”) between the domain registrant and any outside correspondence.

Customer Benefits

  • Registrant’s identity is not available to the public.
  • Registrant’s personal information cannot be captured and exploited.
  • Shields the registrant from Whois database mining (i.e., the process of scouring and collecting data for personal or commercial purposes).
  • Patent-pending email forwarding system filters domain-related spam.

How it Works

During the domain name registration process, customers may choose to make the registration private. The Domains By Proxy Web site allows private-registration customers to modify account settings, track domain-related correspondence, and more.
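Conceptually, a private registration publishes the proxy's contact details in place of the registrant's, while non-personal fields (the domain itself, expiration dates) pass through unchanged. This sketch illustrates that substitution; the field names and proxy contact values are hypothetical, not Domains By Proxy's actual record format.

```python
# Hypothetical proxy contact fields that replace the registrant's in
# the public Whois record. Values are illustrative only.
PROXY_CONTACT = {
    "registrant_name": "Domains By Proxy, LLC",
    "email": "proxy@example.com",   # hypothetical forwarding address
    "phone": "+1.0000000000",       # hypothetical
}

def public_whois_record(private_record: dict) -> dict:
    """Build the publicly visible record: proxy contact fields overwrite
    the registrant's personal fields; everything else passes through."""
    public = dict(private_record)
    public.update(PROXY_CONTACT)
    return public

private = {
    "domain": "example.com",
    "registrant_name": "Jane Smith",
    "email": "jane@example.com",
    "phone": "+1.5551234567",
    "expires": "2025-01-01",
}
record = public_whois_record(private)
print(record["registrant_name"])  # Domains By Proxy, LLC
print(record["domain"])           # example.com
```

The registrant keeps full control of the underlying registration; only the publicly displayed contact fields change, and mail sent to the proxy address is forwarded (or filtered) on the registrant's behalf.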

DomainAlert® Pro

Product Overview

DomainAlert® Pro – Monitoring, Backordering and Investor’s Edge – is a family of products that helps users keep close watch over currently registered domain names, providing timely, invaluable information on domain status as well as securing the inside track on registering soon-to-expire domains.

DomainAlert Monitoring alerts the user if and when a monitored domain becomes available and keeps him/her informed about changes to the domain’s Whois information and registration period. The DomainAlert Monitoring service can be applied to any domain name, regardless of its registrant and registrar.

DomainAlert Backordering allows the customer to “backorder” a domain, meaning that DomainAlert will attempt to register the domain on the user’s behalf as soon as its registration expires and the domain becomes available. DomainAlert Backordering includes DomainAlert Monitoring capabilities as well.

DomainAlert Investor’s Edge helps the customer by maintaining a continually updated list of expiring domains from which the customer can select the one(s) he/she wishes to register upon expiration.
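At its core, this kind of monitoring repeatedly snapshots a domain's Whois fields and diffs consecutive snapshots to detect changes such as a pushed-out expiration date or a registrar transfer. A minimal sketch of that diff step, with illustrative field names (not DomainAlert's actual implementation):

```python
def whois_changes(old: dict, new: dict) -> dict:
    """Return {field: (old_value, new_value)} for every field that differs
    between two Whois snapshots of the same domain."""
    fields = set(old) | set(new)
    return {
        f: (old.get(f), new.get(f))
        for f in fields
        if old.get(f) != new.get(f)
    }

# Hypothetical snapshots taken a day apart:
yesterday = {"registrar": "Registrar A", "expires": "2024-06-01", "status": "ok"}
today     = {"registrar": "Registrar A", "expires": "2025-06-01", "status": "ok"}

print(whois_changes(yesterday, today))
# {'expires': ('2024-06-01', '2025-06-01')}
```

A monitoring service would run this comparison on a schedule and alert the user whenever the returned dictionary is non-empty; in the example above, it would report that the registration was renewed for another year.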

Customer Benefits

  • Best chance to register a desired, soon-to-be-expiring domain name.
  • Constantly monitors the expired-domain market.
  • Reports timely, accurate information.
  • Provides all the tools needed for speculation in the domain name market.

How it Works

Customers can choose between DomainAlert Monitoring, Backordering, and Investor’s Edge.