Wednesday, 21 October 2009

More angles on fast Google Indexing

Now if you're still not indexed on Google and Yahoo, there are a couple more tricks.
The first one again uses the popularity of the social networking sites and the interest taken in them by the search engines.
So quite simply, create yourself a Digg account if you haven't got one already, and put up a post which links back to your URL. This will attract the robots to follow the link and start the indexing process for your site. Also, if you post your Digg in a good category you might get some traffic for free!
I'll demonstrate the next one live with one of my new sites, Classical Chandeliers.
So first, make reference to some keywords and provide a brief marketing blurb as below:-

Classical Chandeliers sell exquisite chandeliers to the public and trade offering outstanding value.
Add some Keywords: Chandeliers, Crystal, Bohemian, Lighting.

Put in the URL in XHTML code:-
eCommerce Chandelier Shop
created by Towers IT PS.
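As a sketch of that link in XHTML (the exact href targets are my assumption, taken from the URLs quoted elsewhere in these posts):

<!-- href targets assumed from URLs quoted in these posts -->
<a href="http://www.classicalchandeliers.co.uk">eCommerce Chandelier Shop</a>
created by <a href="http://www.towersitservices.co.uk">Towers IT PS</a>.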


Then go to 'Subscribe to: Posts (Atom)'. If you are using Firefox you can simply subscribe to this feed using Yahoo. If you are using IE then click and pick up the Atom URL. In this case:
http://www.towersitservices.co.uk/blog/atom.xml
Go to My Yahoo, add an RSS feed and enter the Atom URL.
You now have the connection from Yahoo to your blog and back to the website you are launching, and have made another step along the SEO journey.
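For reference, Firefox discovers the feed through an autodiscovery link element in the blog page's head. A minimal sketch of that element (the title attribute here is illustrative; the href is the Atom URL above):

<!-- Sketch: title is illustrative; href is the feed URL quoted above -->
<link rel="alternate" type="application/atom+xml"
      title="Atom feed" href="http://www.towersitservices.co.uk/blog/atom.xml" />

IE users can copy the href value from this element directly.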

Monday, 19 October 2009

Google Indexing... Is it a science or a black art?

As all webmasters know, it is important to get a new website indexed on Google quickly. You can focus on a multitude of other search engines as well, but as Google drives the majority of the traffic you should concentrate there.
So step 1. Create the site and submit the URL to Google.
Step 2. Create yourself a Google account. In here, get the site verified by adding the required meta tag into the header, and create a sitemap. Google and the other search engines love sitemaps as they give a precise structure and inform the robots where to look (a sketch of both follows these steps).
Step 3. If possible, get yourself an inbound link from a well-respected site with a good Google PageRank. That site will be hit by the robots regularly, and they will follow the link down to yours.
Step 4. Utilise the social networking sites like Twitter, which get crawled regularly by Google, and put in a link back to your URL. You can use a tiny URL here, so as an example the Towers IT eCommerce site http://www.classicalchandeliers.co.uk is reduced to http://tinyurl.com/yge5e7v. Just search on Google for TinyURL and go. Also, you can create a blog on a well-respected site with a link back to your URL, which gives a high probability that the robots will venture down that link.
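As a sketch of Step 2 (the verification token is a placeholder, and the sitemap is cut down to a single entry):

<!-- Verification meta tag, added to the <head> of the home page -->
<!-- content value is a placeholder, not a real token -->
<meta name="google-site-verification" content="YOUR-TOKEN-HERE" />

<!-- Minimal sitemap.xml following the sitemaps.org protocol -->
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.classicalchandeliers.co.uk/</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>

Upload the sitemap to the site root and submit its URL through your Google account.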
So now you are complete, and depending on the success of your actions the indexing can take anywhere between a few hours and several months.
Right, so when you are indexed you can then focus on the SEO activities to improve your rankings. In summary this entails:-
1. Authoring good and unique content.
2. Good use of keywords in tags and content (a sketch follows this list).
3. Acquire good quality inbound links from sites with a good page rank and similar content to yours. This not only increases page rank but will also result in inbound traffic through the links themselves.
4. Get listed on quality directories.
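As a sketch of point 2, the head tags for a site like Classical Chandeliers might look like this (the values here are illustrative, drawn from the marketing blurb in the previous post):

<!-- Illustrative head tags; values drawn from the blurb above -->
<title>Classical Chandeliers | Crystal and Bohemian Lighting</title>
<meta name="description" content="Exquisite chandeliers for the public and trade, offering outstanding value." />
<meta name="keywords" content="chandeliers, crystal, bohemian, lighting" />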



eCommerce Chandelier Shop
created by Towers IT PS.

Sunday, 4 October 2009

The Future of the Web.

We've been through the Web 1.0 era, where web sites were newspapers and magazines displayed on the computer screen, marketing companies' products or serving as useful information sources for organisations.
Things have now advanced, and with Web 2.0 we have rich sites facilitating information sharing across communities; these attract users to contribute, and therefore their future is secure. There are a multitude of examples across wikis, blogs and social networking sites where the new technological advancement has been matched to a business need or burning desire.
The enabling technologies provide an improved user experience through client-side tools (e.g. Ajax, Adobe Flash, Flex and JavaScript) combined with server-side tools (e.g. ASP.NET, PHP and Java).
But where is it all going now?
Well, certainly the advancements in artificial intelligence and the semantic web will contribute significantly to improved search engines and improved aggregation of data. But this will not be constrained to the search engines. Currently, job board aggregators and price comparison sites need to develop contractual business relationships with companies in order to aggregate offerings and provide them to the consumer in a single portal. These advancements will negate the need for such contractual business relationships and enable anyone to set up a price comparison or job board aggregation capability!
This openness will be further enabled by a transformation in the way companies offer their products on the web. We have seen a successful transformation to open source software with great tools like Linux, PHP and MySQL, and this is likely to be followed by companies providing not just web sites but actual web services that consumers can use on demand. Through this transformation, which is a part of cloud computing, we will see far more accurate and real-time aggregation of data into consumers' personalised formats.
The web is a mine of useful information, but clouded with overload. Consumers' time is at a premium, and this is where the 'Attention Economy' concept comes in. Consumers will agree to opt in to services in exchange for their attention. This is a complex issue to be grasped and managed by service operators, who need to strike a balance between offering too many services and not getting enough attention (which can be transformed into business advantage), and offering too few services and missing out on potential business opportunities. It will also place a high demand on service providers to deliver high quality, wanted services, otherwise consumers will opt out. For sure, the successful jugglers in this business equation will be the winners.
One great vision is to integrate services closely with the most relevant device. An obvious example is the navigation computer in your car. But to extend this concept, why not have a larder which orders new food supplies when empty, a fridge which re-orders milk, and a medicine cabinet that puts in repeat prescriptions to your doctor!

The Semantic Web - fact or fiction?

The web is a vast repository of information: some useful, some invaluable and some incorrect. This information is generally retrieved through search engines or aggregated by applications to provide information pertinent to the task in hand. The issue is that the whole process of interpreting information from a multitude of sources needs human intelligence to accurately interpret and aggregate data into concise and useful information. This is where computer-based searches fall down! The vision of the Semantic Web is to provide the capability for computers to understand information in similar ways to the human brain. The steps towards this typically involve using the Resource Description Framework (RDF) to classify data, and then serialisation formats (e.g. RDF/XML, N3, Turtle) to exchange and assimilate the information.
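As an illustrative sketch (the subject URL is taken from the earlier posts, and the Dublin Core vocabulary is my choice for the example), a tiny RDF/XML document describing a site might look like:

<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <!-- One resource described by machine-readable subject-predicate-object triples -->
  <rdf:Description rdf:about="http://www.classicalchandeliers.co.uk">
    <dc:title>Classical Chandeliers</dc:title>
    <dc:subject>Chandeliers, Crystal, Bohemian, Lighting</dc:subject>
  </rdf:Description>
</rdf:RDF>

Because each statement is an explicit triple, a machine can merge these with triples from other sites without any human interpretation.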
The approach is not new, but uses principles from Artificial Intelligence (AI) work to understand data, based on the science of neurology and how the human brain thinks. There have been some significant advances here: for example, Numenta have developed Hierarchical Temporal Memory (HTM) algorithms, modelled on the way the brain works, to solve complex problems, alongside 'fuzzy logic' techniques to accurately interpret information.
So the science is there and there are leading practitioners, but it still requires significant advancement for mainstream adoption.
... But for the successful integrators, the rewards will be high.