Archive for October 2011
The key driving factors behind Web 3.0 marketing include browsing habits, browsing methods, more intelligent information, the experience you are looking for, and the openness of the Web. Web 3.0 marketing is the convergence of new technologies and rapidly changing consumer buying trends.
Live, streaming video is outpacing static video, and companies like Twitter, Plurk, and Jaiku are growing much more
rapidly than Blogger, WordPress, or TypePad. The Web 3.0 marketing world is where customized, intelligent information
is available at our fingertips, on any device, from anywhere in the world.
Components of Web 3.0 Marketing
Microblogging: the ability to share your thoughts within a set number of characters. People are busy and have limited time, so why not get right to the point of the story in 140 characters or fewer? Examples include Twitter, Plurk, and Jaiku.
Virtual Reality Worlds: places users visit to interact with others from around the world in a 3-D setting. Meetings are being conducted in these spaces, and trade shows are being replaced with virtual reality shows. Examples include Second Life and Funsites.
Customization/Personalization: lets visitors create a more personalized experience. They are starting to expect their name at the top of Web sites, personal e-mails, and even advanced checkout options that suit their buying habits. As the Web becomes more intelligent, personalization will be the norm. Examples include SendOutCards, Google, and Amazon.
Mobile: plays on the fact that there are billions of cellphone users throughout the world, far more than there are PC users. Consumers are surfing the Web and purchasing products right from their mobile phones. They are also becoming instant journalists by shooting raw footage of events as they happen. Examples include iPhones and BlackBerrys.
On-Demand Collaboration: lets users look over documents together and make changes in real time. Software as a service also fits into on-demand collaboration, as it allows users to rely on purely Web-based solutions. Examples include Google Docs, http://www.Salesforce.com, http://www.Slideshare.net, and http://www.Box.net.
Web 3.0 Marketing Technology
–A system to send voice broadcasts to mobile phones that will also send SMS (text messages). Example: www.trumpia.com.
–A Web-based customer relationship management system, such as Salesforce.com.
–An all-in-one solution for capturing leads, managing sales, and garnering affiliates. http://www.amazingshoppingcart.com is a great place to start.
–A solid team (or an individual, to start), whether in-house or outsourced, that knows programming. This person should be able to develop applications, work with open-source code, and ideally program in Second Life. Check out http://www.RentACoder.com for ideas.
–A platform to communicate and collaborate virtually across a company. My all-time favorite company here is Google; Taz Solutions, Inc. is one example of a company that uses it.
Analyzing and understanding heat maps has numerous benefits. The data shows where visitors do and do not click, which is useful information when designing landing pages. It also helps determine optimal advertisement placement, reduce shopping cart abandonment, maximize conversions of online forms, and predict how visitors will use the site.
In traditional heat maps, the brighter the colour the more clicks a specific area is receiving. This is effective as it shows the degree to which people are interacting with live elements on a website and how a design and site structure can be improved by understanding what elements users are truly interested in, as opposed to those areas where users only consider clicking.
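To make that concrete, here is a minimal sketch of how a click heat map can be built: click coordinates are binned into a grid, and the cell counts drive the colour intensity. The coordinates, page size, and grid size below are invented for the example.

```python
# Build a simple click heat map: bin (x, y) click coordinates into a grid.
# The click data, page size, and grid dimensions are invented for illustration.

GRID_W, GRID_H = 4, 3          # number of cells across and down
PAGE_W, PAGE_H = 800, 600      # assumed page size in pixels

clicks = [(100, 50), (120, 60), (700, 550), (110, 40), (400, 300)]

def heat_grid(clicks):
    """Count clicks per grid cell; higher counts mean 'brighter' cells."""
    grid = [[0] * GRID_W for _ in range(GRID_H)]
    for x, y in clicks:
        col = min(x * GRID_W // PAGE_W, GRID_W - 1)
        row = min(y * GRID_H // PAGE_H, GRID_H - 1)
        grid[row][col] += 1
    return grid

grid = heat_grid(clicks)
# The brightest cell is the one with the most clicks.
hottest = max((grid[r][c], r, c) for r in range(GRID_H) for c in range(GRID_W))
```

A real tool would render the counts as colours over a screenshot; the aggregation step is the same idea.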
But heat maps alone do not show us all of the information needed for true optimization, as they tend only to track clicks on links. Coupling heat map data with more detailed analytics lets us track and visualize mouse movement and clicks on non-link (non-clickable) elements. In essence, we get a deeper understanding of how users are experiencing a page as a whole, not just where they click. How helpful would it be to know whether users are hovering over links (often called hover time), and how often? If that information were available, we might be able to determine how compelling the anchor text of certain links is, even if users did not actually click them, and how it might be improved.
We can also determine the time from when a page loads until a user clicks a certain link. This is helpful in determining if the placement we’ve chosen for a specific design element is optimal or if it should be brought into greater focus (above the fold, for example) to increase the number of clicks.
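As a rough sketch of how hover time and time-to-first-click could be computed, assume we have a simplified event log. The event schema and timestamps here are made up for the example, not taken from any particular analytics tool.

```python
# Compute per-link hover time and time from page load to first click from a
# simplified event log. Each event is (timestamp_seconds, event_type, target);
# this schema is invented for illustration.

events = [
    (0.0, "load", "page"),
    (1.2, "hover_start", "signup-link"),
    (2.7, "hover_end", "signup-link"),
    (3.0, "hover_start", "pricing-link"),
    (3.4, "hover_end", "pricing-link"),
    (4.1, "click", "pricing-link"),
]

def hover_times(events):
    """Total hover duration per target, in seconds."""
    open_hovers, totals = {}, {}
    for ts, kind, target in events:
        if kind == "hover_start":
            open_hovers[target] = ts
        elif kind == "hover_end" and target in open_hovers:
            totals[target] = totals.get(target, 0.0) + ts - open_hovers.pop(target)
    return totals

def time_to_first_click(events):
    """Seconds from page load to the first click, or None if no click."""
    load = next(ts for ts, kind, _ in events if kind == "load")
    clicks = [ts for ts, kind, _ in events if kind == "click"]
    return clicks[0] - load if clicks else None
```

Here the user hovered over the signup link for 1.5 seconds without clicking, which is exactly the kind of signal that suggests the anchor text drew interest but did not convert.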
What heat map analytics can show with great clarity is not just how well a Web design, its layout, and its structure are performing in terms of clicks; it can also help us make modifications based on seemingly unrelated information, such as the referrer. For example, review the highest-volume entry pages by comparing the best- and worst-performing pages. Then use heat maps to determine the relationship between the top referrer of those pages and clicks (and the lack of clicks). Once the worst offenders are found, multiple variables and landing page approaches based on the source of traffic can be tested.
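The referrer comparison described above boils down to segmenting click behaviour by traffic source. A minimal sketch, with invented sample records:

```python
# Compare how visitors from different referrers interact with a landing page
# by computing the fraction who clicked the call to action. The sample
# records and field names are invented for illustration.
from collections import Counter

visits = [
    {"referrer": "google", "clicked_cta": True},
    {"referrer": "google", "clicked_cta": False},
    {"referrer": "newsletter", "clicked_cta": True},
    {"referrer": "newsletter", "clicked_cta": True},
    {"referrer": "google", "clicked_cta": False},
]

def cta_rate_by_referrer(visits):
    """Fraction of visits from each referrer that clicked the call to action."""
    totals, clicks = Counter(), Counter()
    for v in visits:
        totals[v["referrer"]] += 1
        clicks[v["referrer"]] += v["clicked_cta"]  # True counts as 1
    return {ref: clicks[ref] / totals[ref] for ref in totals}
```

A low rate for one referrer and a high rate for another is the cue to test a different landing page for that traffic source.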
Reviewing the activity of visitors from different referral sources is but one of the ways heat maps can be used. Clicktale, for example, offers statistics to its users based on existing customers versus new visitors, and customers who made a purchase versus those who did not. You can segment by almost anything: age, gender, or location, for example, or compare specifics such as bachelors in their 30s versus mothers versus teenagers.
Analyzing data provided through heat maps can be cumbersome, but keep these points in mind during analysis and when drafting suggestions and recommendations:
–Areas that receive few clicks could be removed. If users don't find them important, they may be more of a distraction than a help. If you're hesitant to leave sections empty, replace them with something entirely different to see if that generates more interest from users.
–While identifying areas that receive the most attention is useful, particularly for those responsible for optimizing content, observing what users do after they click through to those destinations is perhaps even more useful. For example, if users do not click anywhere after arriving on a page, they may have hit a dead end. Try to turn those endings into new beginnings by offering content suggestions or product recommendations.
The job of a Web designer, or anyone responsible for optimizing the user experience, is to maximize interest in the site and its products or content while maintaining usability. The challenge is to balance aesthetics with function. To achieve this you must give website visitors a direction, guiding them where you want them to go.
Google TrustRank is the degree to which Google trusts that your website will be valuable to visitors if presented as a search result. Google will place your website highly in search results if it has a good TrustRank. TrustRank is earned the same way as PageRank: by receiving links from other sites. The age of a site also increases its TrustRank.
The PageRank shown on the Google toolbar is updated only every two to three months, so the PageRank you see today could be very different from the page's actual PageRank, which Google tabulates daily.
Google penalizes websites that sell links by crippling their ability to transfer TrustRank, and it does not inform penalized websites that they have lost the ability to pass TrustRank to other sites. Most of the links that are sold today are from penalized pages and have absolutely no value to their buyers.
–Google gives the most TrustRank to sites that have links from well-linked web pages.
–Google does not allow sites that sell links to pass trust, but shows no indication of which sites have been disallowed from passing trust.
–A site has a high TrustRank if its links are from websites that, to the best of your knowledge, have never sold links.
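Since TrustRank is described here as being earned the same way as PageRank, a toy version of the textbook PageRank iteration shows the underlying idea of rank flowing through links. This is an illustration of the published concept only; Google's actual algorithms are not public, and the graph below is made up.

```python
# Toy PageRank: iterate rank flow over a tiny made-up link graph until it
# settles. This follows the textbook formulation with a damping factor;
# it is an illustration of the idea, not Google's real implementation.

links = {            # page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}       # start with equal rank
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # split rank across outlinks
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
# "c" receives links from both "a" and "b", so it ends up with the most rank.
```

The takeaway matches the bullet points above: a page linked from other well-linked pages accumulates the most rank.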
If a site looks professional, has been around for some time, and doesn't have anything spammy written on it, it is likely in good standing with Google and will transfer TrustRank properly. If you search for a particular term in Google, the top 40 results definitely have TrustRank, and the top 10 results definitely have a lot of it. Remember to search for competitive terms. Do not search for uncommon terms; even the top 5 results may not have much TrustRank.
Because "search engines" is a high-volume search and these are the top 10 results for it, we can immediately feel certain that each of the 10 results has a good amount of TrustRank. Therefore, any of them would be a good target to approach about acquiring a link on their site.
Now let's try a less common search: "search engine marketing." Here too, the top 10 results have quite good TrustRank. When you see a lot of ads come up around a search, it usually means the search seemed worthwhile enough to other businesses that they were willing to invest money in it. That's a sign the search is competitive and the top 10 results probably had to earn their spots on the first page with a healthy amount of TrustRank.
Let's try a much less competitive phrase: "search engines list." Because the search is obviously not an especially common one, these sites do not have enough TrustRank to be worth approaching for links.
Try to get a lot of websites to link to your website that:
–Have many inbound links
– Have never sold links
This will give your site TrustRank and cause Google to send it traffic.
When we search the internet, many of us use Google, the most popular search engine. But Google is not necessarily the only way to find things on the internet, or the best. Very often the information Google displays will not include what you are looking for. When it's important to find the best information on the internet, the trick is knowing where to start looking, and using other search engines.
The way Google became successful, and the reason its results aren't always as good as they should be, is how it works out how useful a site is. For example, if a popular website in a specific niche has lots of other people linking to it, Google concludes it must be fairly authoritative and deserves to go near the top of the list of search results when people search on a relevant keyword. This new way of deciding which sites to list first, along with indexing as much of the internet as possible, put Google ahead of its competitors.
This popularity led to a lot of people asking owners of other sites to link to their own, or setting up 'link farms' where lots of sites link to each other, trying to boost the ranking of particular ones. That is why spammers sign up to forums and never write anything, instead listing their own site address in the member profile. This is why you sometimes find that results on Google aren't so relevant to your search.
There are other search engines that can be useful. More importantly, there are many specialized search engines that deal with a particular type of information. Most modern web browsers have a search box, and you can usually choose which search engine it uses.
What is the question? Different search engines will provide different results, and organize them differently. Google places Wikipedia at the top of the list and has pages of information to go through. Answers.com provides a long list of information that includes Wikipedia, but also other reputable sources such as Britannica. Bing provides categories such as Biography and Family Tree, which can help find the right information. For a good overview of facts, Answers.com provided the best result.
When the question is really a question, it is worth typing the whole phrase into a search engine. Google will give plenty of solutions; Answers.com will give a single answer. Ask.com will come up with results similar to Google's, and Bing will provide the least useful results.
Bing can be very useful if you are searching for a company. It will display the sponsored results, the UK customer services number, quick links to the most important parts of the UK website, and a box to search within that site.
Google has an advanced search option, and it is possible to restrict a search to certain sites. This feature is provided by most search engines.
Search Images: Google and Bing have similar image search options. Type the words into their image search tools and they will display a list of preview pictures that can be clicked to see full-sized versions. It is possible to narrow down the results by size, colour and other options. For photos for your website, visit the photo-sharing site Flickr for images licensed for reuse. Choose 'Advanced search' and tick the box 'Only search within Creative Commons licensed content'. An additional option finds images suitable for commercial use. Flickr is also useful for finding images taken in a particular place, since it supports 'geotagging', where images can have their geographical location embedded in them.
Search Moving Pictures: Clicking videos in a list of Google results will open a new page on its video site YouTube, so you have to switch between sites. Bing will play a video when the mouse hovers over it, and links on the left side of the page allow videos of certain sizes or quality to be shown, or videos from specific sources such as YouTube.
Right Price: Search engines can be used when you are looking to buy a product online and searching for the best price. There are many price-comparison sites, but ordinary search engines can help too. Google and Bing have a 'shopping' link on their front pages that will help you find the best prices on a product. Search for a specific product and Bing will give links for reviews, support and prices, making it easy to find the right information; the shopping link lists retailers and their prices. Google's shopping home page lists things other people have searched for, while Bing's home page allows you to browse categories.
Map Service: Electronic maps can be very helpful. They can be used to find an address, see a satellite view of an area or plan a route between two locations. Bing has its own map service; you can type in a postcode to find the nearest station. Plotting a route between points is simple, and dragging a route with the mouse can make it go via specific places.
Google Maps is useful for finding businesses: type in an address or a query. In Bing, you will have to click the 'Find businesses' link.
SEO translation is localising a site to make it as visible as possible in the target language and culture, and to achieve higher rankings in search engines.
Companies grow by extending their product line; another way is to expand their service line to include more geographic regions. Doing so presents several challenges for these businesses and for the search engine optimizers who serve them.
There is a big difference between making a site accessible in multiple languages and taking a business to new regions. For multiple languages, it is enough to consider basic organization and to be thoughtful about how users of languages other than your own react to content and design; much more is required of those going multiregional.
Multiple Languages: Managing multiple language versions of a website and making sure localized content appears in search results pages is relatively straightforward; it is much like optimizing any site. When it comes to leveraging translated content for SEO, suggestions include making sure the page language is obvious, making each language discoverable, and paying attention to URL naming.
Search engines use the content of the page and its navigation as the primary signals for determining the language of the page, so both should be accurately translated. Researching language-specific keyword search volumes will ensure the terms you use are those that provide the most value to your users and your overall SEO efforts. Another thing to remember is that several dialects can be in use in the same region.
Separating the site by language (or by region, where the languages are similar) enables the creation of language-specific sitemaps, which in turn let search engines discover more of the site. Interlinking the various language versions also provides search engines with cues that additional content is available for indexing. It is better to have a speaker of both languages translate content; automated translation doesn't always make sense.
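One way to implement the interlinking described above is with rel="alternate" hreflang link tags, which point search engines at each language version of a page. A minimal sketch follows; the domain, URLs, and language codes are placeholders, not a real site.

```python
# Generate rel="alternate" hreflang link tags for each language version of a
# page, so search engines can discover the translated variants. The full set
# of tags should appear on every language version. The domain and language
# codes below are placeholders for illustration.

translations = {
    "en": "http://www.example.com/page",
    "de": "http://de.example.com/seite",
    "fr": "http://fr.example.com/page",
}

def hreflang_tags(translations):
    """One <link> tag per language version, in a stable order."""
    return [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in sorted(translations.items())
    ]

for tag in hreflang_tags(translations):
    print(tag)
```

The same language-to-URL mapping can also drive the per-language sitemaps mentioned above.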
Local Sense: The initial reaction may be to purchase as many relevant country-code top-level domains (ccTLDs) or internationalized domain names (IDNs) as possible, but the acquisition requirements can be demanding and the investment costly. ccTLDs and IDNs do provide a strong signal to users and search engines that the site is explicitly intended for a specific country.
Most businesses use a subdomain or subdirectory for translated content. For example, instead of domain.in, use in.domain.com. One has to find out how different regions or countries abbreviate their individual languages.
While server location is a signal to search engines about a site’s intended audience, it is in no way definitive, as many websites use distributed content delivery networks or are hosted in a country (not the one being targeted) with a better infrastructure. Consider mapping a subdomain that includes translated content to a Web host in that particular regional area.
Search engines do provide a way to designate that a site is intended for a specific country. Google Webmaster Tools provides geo-targeting capabilities; all that is required is to select the appropriate country. This feature can only be used for sites with a generic top-level domain, however, such as .com or .org. Sites with country-code top-level domains such as .in are already associated with a geographic region. If no information is entered in Google Webmaster Tools, Google will make geographical associations based on the top-level domain (.com) and the IP address of the web server from which the content was served.
The best way to inform users and search engines that a website is intended for a geographic area and a language is to be local. Use local addresses and phone numbers, acquire links from local sites, and set up local profiles through Google Places.
Avoid certain pitfalls with regard to site structure or page names. For example, stay away from URL-based parameters such as yoursite.com?loc=in. Location-based meta tags or HTML attributes are rarely used for geo-targeting.