A Visual Playground

About me (Websilt, my portfolio)

Vijay, based in New Delhi, is the co-owner of Websilt, together with you, the clients of Websilt, in your offices throughout the world. He is the visible face, obviously, because he is a full-time, hands-on web design and development professional working for Websilt, and Websilt, in turn, works for you.

Websilt is a full-service website design and Internet marketing company dedicated to providing you with attractive design and spot-on marketing for your websites. It's a young company, with time and energy on its side. Vijay started his career almost four years ago and has been providing cutting-edge web design services ever since. He holds an M.Sc. (IT) from Sikkim Manipal University and a Bachelor of Arts (Honours) from Delhi University.

Websilt has grown out of the professional experience Vijay gained while serving successful companies over the years. His strong desire to strike out on his own, along with his passionate interest in Internet technologies, has served those companies well too. While working on your projects, he communicates with you without technical jargon, and that's why he ends up with a better understanding of your requirements and the specific problems you face.

Vijay specializes in creating web pages that are compatible with all browsers and load fast, driving the message home with minimal fuss. Right now he is concentrating on keeping overheads low, which results in lower costs for you. It doesn't matter where you are located; the level of service remains as personalized as it gets.

Last but not least, Vijay treats every website he creates as part of his portfolio, and that's why he cares for each one the way he cares for his own site. Need we say more?

Interested in me? Liked my work? Want me to work for you?
Write to me at

What Should I Do With My Life?

The real meaning of success — and how to find it.

It’s time to define the new era. Our faith has been shaken. We’ve lost confidence in our leaders and in our institutions. Our beliefs have been tested. We’ve discredited the notion that the Internet would change everything (and the stock market would buy us an exit strategy from the grind). Our expectations have been dashed. We’ve abandoned the idea that work should be a 24-hour-a-day rush and that careers should be a wild adventure. Yet we’re still holding on.

We’re seduced by the idea that picking up the pieces and simply tweaking the formula will get the party started again. In spite of our best thinking and most searing experience, our ideas about growth and success are mired in a boom-bust mentality. Just as LBOs gave way to IPOs, the market is primed for the next engine of wealth creation. Just as we traded in the pinstripes and monster bonuses of the Wall Street era for T-shirts and a piece of the action during the startup revolution, we’re waiting to latch on to the new trappings of success. (I understand the inclination. I’ve surfed from one boom to the next for most of my working life — from my early days as a bond trader to my most recent career as a writer tracking the migration of my generation from Wall Street to Silicon Valley.)

Read the full article at http://www.fastcompany.com/magazine/66/mylife.html

Facebook overhauls search as it crosses 400 million users

On its sixth birthday, Facebook launched a host of new features as it crossed the 400-million user mark.
The most interesting of them may be its revamped search. When you type in names, it auto-completes for people who are the closest to you by social proximity — e.g. the people you share the most mutual friends with. Not only that, Facebook CEO Mark Zuckerberg says that search indexes content like Pages and Applications two degrees out in your social graph. That means your friends (one degree) or your friends’ friends (two degrees).
You’ll also have the option of seeing more content like status updates and posts through search, as long as it’s supposed to be visible to you through privacy settings. With the new privacy settings and more public content, search should figure increasingly prominently in the user experience, as it will be able to surface more and more content over time.
There is also a new photo uploader accessible from the front page that makes it faster and easier to post new photos to the site, along with a games and applications dashboard that replaces the older game notifications.

Read the Full Story at VentureBeat

Google Apps browser support

In order to continue to improve Google products and deliver more sophisticated features and performance, Google is harnessing some of the latest improvements in web browser technology. This includes faster JavaScript processing and new standards like HTML5. As a result, over the course of 2010, Google will be phasing out support for Microsoft Internet Explorer 6.0 as well as other older browsers that are not supported by their own manufacturers.

Google plans to begin phasing out support for these older browsers in the Google Docs suite and the Google Sites editor on March 1, 2010. After that point, certain functionality within these applications may have higher latency and may not work correctly in these older browsers. Later in 2010, Google will start to phase out support for these browsers in Google Mail and Google Calendar.

Google Apps will continue to support Internet Explorer 7.0 and above, Firefox 3.0 and above, Google Chrome 4.0 and above, and Safari 3.0 and above.
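
The support matrix above can be expressed as a simple lookup table. This is an illustrative sketch, not Google's actual detection code: it checks a browser name and major version against the minimums listed.

```javascript
// Minimum supported major versions, per the Google Apps announcement above.
var MIN_SUPPORTED = {
  'MSIE': 7,     // Internet Explorer 7.0 and above
  'Firefox': 3,  // Firefox 3.0 and above
  'Chrome': 4,   // Google Chrome 4.0 and above
  'Safari': 3    // Safari 3.0 and above
};

// Returns true if the given browser name/major version meets the minimum.
function isSupported(name, majorVersion) {
  var min = MIN_SUPPORTED[name];
  if (min === undefined) return false; // unknown browser: assume unsupported
  return majorVersion >= min;
}
```

In practice you would parse the name and version out of `navigator.userAgent` before calling a check like this.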

Starting this week, users on these older browsers will see a message in Google Docs and the Google Sites editor explaining this change and asking them to upgrade their browser. Google will also alert you again closer to March 1 to remind you of this change.

In 2009, the Google Apps team delivered more than 100 improvements to enhance your product experience. Google is aiming to beat that in 2010 and continue to deliver the best and most innovative collaboration products for businesses.

HTML Best Practices for all

Without further ado, let’s review some best practices to keep in mind when creating your markup.

1. Always Close Your Tags

2. Declare the Correct DocType

3. Never Use Inline Styles

4. Place all External CSS Files Within the Head Tag

5. Consider Placing Javascript Files at the Bottom

6. Keep Your Tag Names Lowercase

7. Use H1 – H6 Tags

8. Never Use Inline Javascript

9. Bind Navigation with an Unordered List

10. All Images Require “Alt” Attributes

11. Use a CSS Reset

For instance:

html, body, div, span,
h1, h2, h3, h4, h5, h6, p, blockquote, pre,
a, abbr, acronym, address, big, cite, code,
img, ins, kbd, q, s, samp,
small, strike, strong,
dl, dt, dd, ol, ul, li,
fieldset, form, label, legend,
table, caption, tbody, tfoot, thead, tr, th, td {
  margin: 0;
  padding: 0;
  border: 0;
  outline: 0;
  font-size: 100%;
  vertical-align: baseline;
  background: transparent;
}
body {
  line-height: 1;
}
ol, ul {
  list-style: none;
}
blockquote, q {
  quotes: none;
}
blockquote:before, blockquote:after,
q:before, q:after {
  content: '';
  content: none;
}
table {
  border-collapse: collapse;
  border-spacing: 0;
}

IE Bugs and Fixes

I have listed some common Internet Explorer bugs and their fixes. This should help you cut down the time spent debugging layout inconsistencies in IE.

Double Margin Bug Fix

If a floated element has a left and/or right margin assigned, IE6 will double the margin. For instance, margin-left:5px becomes 10px. You can fix this by adding display:inline to the floated element.

.wrapper {
  display:inline;  /* fixes the IE6 double margin bug */
}
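
For context, here is a fuller sketch of the scenario (the class name is illustrative): a left-floated column with a left margin, plus the fix in place.

```css
/* A left-floated element with a left margin: IE6 doubles the
   margin to 10px on the first float in a row. */
.sidebar {
  float: left;
  margin-left: 5px;
  display: inline; /* harmless in other browsers; cancels IE6's double margin */
}
```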

Differences between Google Page Rank and Alexa Page Rank

Google PageRank and Alexa Rank are the two most common ranking measures used by webmasters on the web, but they are very different. Here are some of the differences between them.

1. Google PageRank is a measure of a page’s importance based on the number and quality of incoming links to your website, while Alexa Rank is computed from traffic to your website as recorded through the Alexa Toolbar.

2. Google PageRank is published only 3-4 times a year, while Alexa ranks your website by current traffic.

3. Because Google PageRank is not updated as frequently as Alexa Rank, it can sometimes appear outdated.

4. Google PageRank doesn’t require any toolbar installed in your browser, but Alexa is crippled without its toolbar.

5. Google’s PageRank mechanism is hard to mislead, while Alexa Rank can be inflated by installing the Alexa Toolbar on many computers and reloading the page frequently.

6. Google PageRank is shown as an image (the green toolbar bar), while the Alexa Toolbar shows rank as a number.

7. Google PageRank runs from 0 to 10, where bigger is better, while Alexa Rank runs from the millions down to 1, where smaller is better.
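
The link-based idea behind PageRank (point 1) can be illustrated with a toy power-iteration sketch. This is a simplified model for intuition, not Google’s actual algorithm: each page repeatedly splits its rank among the pages it links to.

```javascript
// Toy PageRank: ranks pages by incoming links, iterating until stable.
// links[i] lists the page indices that page i links TO (no dangling pages).
function pageRank(links, damping, iterations) {
  var n = links.length;
  var ranks = [];
  for (var i = 0; i < n; i++) ranks[i] = 1 / n; // start with equal rank

  for (var it = 0; it < iterations; it++) {
    var next = [];
    for (var i = 0; i < n; i++) next[i] = (1 - damping) / n;
    for (var i = 0; i < n; i++) {
      var share = ranks[i] / links[i].length; // split rank among outlinks
      for (var j = 0; j < links[i].length; j++) {
        next[links[i][j]] += damping * share;
      }
    }
    ranks = next;
  }
  return ranks;
}

// Page 2 is linked by both 0 and 1, so it ends up with the highest rank.
var ranks = pageRank([[1, 2], [2], [0]], 0.85, 50);
```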

How to use Robots.txt

The robots.txt file is placed in your site’s root directory and gives instructions to crawlers visiting your site. Depending on the directives it contains, certain pages or sections can be crawled or kept out of the crawl.

Using a Robots File Effectively

In general we want as much exposure as possible for our sites, but there is some content you don’t want indexed and listed on search engines. This is where robots.txt can be used effectively.


User-agent: defines which bots the following directives apply to. * is a wildcard meaning all bots; use Googlebot for Google.
Disallow: defines which folders or files will be excluded. Leaving it empty means nothing is excluded, / means everything is excluded, and /foldername/ or /filename can be used to exclude specific paths.
Allow: works as the opposite of Disallow; you can list content that may be crawled here. * is a wildcard.
Request-rate: defines the pages/seconds crawl ratio. For example, 1/20 means 1 page every 20 seconds. (Nonstandard; not all crawlers honor it.)
Crawl-delay: defines how many seconds to wait after each successful crawl.
Visit-time: lets you specify between which hours you want your pages to be crawled. (Nonstandard; not all crawlers honor it.)
Sitemap: tells crawlers where your sitemap file is (you must use the complete URL for the file).


This is the robots.txt we use on our site:

User-agent: *
Disallow: /cms/feed/
Disallow: */feed/*
Disallow: /feed
Disallow: /cms/wp-content/
Disallow: /cms/wp-plugins/
Disallow: */wp-content/*
Disallow: /cms/wp-content/plugins/
Disallow: /cms/index.php
Sitemap: http://www.bestblogs.asia/sitemap.xml

HTML Forms Best Practices for Beginners – Basic

Working with XHTML forms can be somewhat scary; they not only use special-purpose HTML elements, but also blur the line between static content and user interface. Let’s review some things to remember when creating your next form.

Good HTML forms require consideration on at least four points:

  1. Semantics
  2. Accessibility
  3. Functionality
  4. Design

Forms can be complicated and sometimes even frustrating for users. Often a form interrupts a user’s main focus on a page: they plan on purchasing a gift, or trying out your new web app, not giving you their shipping address or coming up with yet another password. These tips will make forms easier for you as a developer/designer, and for them as users.
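
The four points above show up in even a small form. Here is a minimal sketch (the field names and action URL are placeholders):

```html
<form action="/signup" method="post">
  <fieldset>
    <legend>Create an account</legend>          <!-- groups related fields (semantics) -->
    <p>
      <label for="email">Email address</label>  <!-- label bound to its input (accessibility) -->
      <input type="text" id="email" name="email">
    </p>
    <p>
      <label for="password">Password</label>
      <input type="password" id="password" name="password">
    </p>
    <p>
      <input type="submit" value="Sign up">     <!-- one clear action (functionality) -->
    </p>
  </fieldset>
</form>
```

With semantics and accessibility in place, design and functionality can be layered on with CSS and script without touching the markup.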