Author Archives: ssmith82

Good find; it will help me brush up on my code!

This is a nice handful of resources for web development. I have been out of the HTML-writing world for at least a year and a half, and I need a refresher. http://www.hongkiat.com/blog/sites-to-learn-coding-online/

Protect the image from the HARVESTERS

Image Harvester

by Philip Guo (philip@pgbovine.net)

April 2005

INTRODUCTION

I have developed an easy-to-use Python script for automatically harvesting JPEG images from a website and a selected number of websites linked from that starting site. It uses the free GNU Wget program to download images, and a number of heuristics to try to grab only images from the most relevant sites. It can be thought of as a more specialized and ‘intelligent’ Wget.

USER MANUAL

To use my Image Harvester script, simply download image-harvester.py, and run it on a computer with Wget installed in a directory where you want to download the images:

python image-harvester.py <url-to-harvest> 

This script downloads all .jpg images on and linked from <url-to-harvest>, then follows all webpage URL links on that page and downloads the images on those pages, and finally follows one more level of webpage URL links from those pages to grab images, except that this time it only follows URLs in the SAME domain to prevent jumping to outside sites. It creates one sub-directory for each page that it crawls, holding the images downloaded from that page.

Your choice of <url-to-harvest> is important in determining how many images this script can harvest. For optimum results, try to choose a page that contains lots of images that you want and also lots of links to other pages with lots of images. The maximum depth of webpage links that this script follows is 2, but that should be enough for most image harvesting purposes. Additional levels of recursion usually result in undesired crawling to irrelevant sites.
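The crawl policy described above (follow everything one level out, then only same-domain links one level further, never deeper) can be sketched as a small decision function. This is an illustrative reconstruction of the rules in the text, not the actual logic of image-harvester.py; the function and parameter names are my own:

```python
from urllib.parse import urlparse

def same_domain(url_a, url_b):
    """True if both URLs share the same network host."""
    return urlparse(url_a).netloc == urlparse(url_b).netloc

def should_follow(link, current_page, depth, max_depth=2):
    """Decide whether the crawler may follow `link` found on
    `current_page`, which sits `depth` levels from the start page.

    Depth 0 is the starting page itself: every link off it is
    followed, because it may be a 'links page' pointing at many
    domains. Links found at depth 1 are followed only if they stay
    in the same domain as the page that contains them, and nothing
    is followed beyond depth 2."""
    if depth >= max_depth:
        return False                         # never recurse past two levels
    if depth == 0:
        return True                          # follow every link off the start page
    return same_domain(link, current_page)   # deeper: same domain only
```

A crawler would call `should_follow` for each `<a href>` it extracts before queueing the link for Wget.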

The Image Harvester script cannot distinguish the images that you want to keep from the ones that you don’t (e.g., thumbnails, ads, and banners). I have written an Image Filterer BASH shell script that tries to filter out undesired images based on a dimension heuristic. If either an image’s width or height is below some minimum threshold (350×350 is what I use), then it’s probably a thumbnail, ad, or banner that you don’t want to keep. This script uses the ImageMagick identify program to inspect the dimensions of all .jpg images, throw away the ones that don’t meet some minimum threshold, and then throw away sub-directories that don’t contain any more images.

To filter your images, download keep-images-larger-than.sh and run it in the same directory where you ran image-harvester.py:

./keep-images-larger-than.sh <min-width> <min-height> 

This will first create sub-directories named small-images-trash and small-images-trash/no-jpgs-dirs to store the filtered-out files and directories, respectively. Then it will find all .jpg images within all sub-directories and move any file whose width or height is less than <min-width> or <min-height>, respectively, into small-images-trash. As a last step, it will move any directories that contain no .jpg images into small-images-trash/no-jpgs-dirs. These trash directories provide a safety net to protect against accidental deletions. After running the Image Harvester and Image Filterer, your sub-directories should be filled only with full-sized images that you want to keep.
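The keep-or-trash decision above boils down to one comparison per image. Here is a minimal Python sketch of that rule (the shell script itself uses ImageMagick; the function names here are mine, and the dimensions would come from `identify` in practice):

```python
def is_too_small(width, height, min_width=350, min_height=350):
    """Heuristic from the article: an image whose width OR height
    falls below its threshold is probably a thumbnail, ad, or banner."""
    return width < min_width or height < min_height

def filter_images(images, min_width=350, min_height=350):
    """Split a {path: (width, height)} mapping into (keep, trash)
    lists, mirroring what keep-images-larger-than.sh does with
    files and the small-images-trash directory."""
    keep, trash = [], []
    for path, (w, h) in images.items():
        (trash if is_too_small(w, h, min_width, min_height) else keep).append(path)
    return keep, trash
```

Note that the rule is an OR, not an AND: a 1000×100 banner fails even though its width is large.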

CODE

Here is the code. Please give it a shot and email me with feedback if you have trouble getting it to work or want me to add additional features.

MOTIVATION

Here are the two main problems that I’ve experienced with automated web crawling and downloading tools, and how this project tries to solve them:

  1. The recursion grows out of control, and the tool ends up crawling to irrelevant sites and downloading tons of images that you don’t want, like annoying ads or banners.

    To solve this problem, I apply a few heuristics to ensure that my script has the best chance of only grabbing the images that you want without crawling to irrelevant sites:

    • It first grabs all images on and linked from <url-to-harvest>, which is either a page with lots of images or a ‘links page’ that links to other pages with lots of images. Your choice of the starting page is important in determining how many images this script can retrieve.
    • Then it crawls to all page links from <url-to-harvest> and grabs all images off of those pages. It must follow all links during this step because you may start at a ‘links page’ which provides links to sites at many domains with images that you want to grab.
    • Then it performs one more level of crawling from those pages (linked from starting page) to only pages in the SAME domain. When you are already at a site that contains the images you want, it may have links to related pages in the same domain with additional images. Links to outside domains at this point are likely to be ads, redirects, or other irrelevant sites.

     

  2. Automated tools get blocked by many web servers because they hog bandwidth and let people download images without viewing the requisite ads and generating revenue for the site.

    Here is how my script tries to prevent getting rejected by servers:

    • Tells Wget to imitate the Firefox web browser in its HTTP request User Agent field in order to hopefully not trip the anti-leeching mechanisms of servers.
    • Tells Wget to slow down its requests and provide a randomized variation on times between requests in order to not overload web servers and reduce suspicion that it’s actually an automated tool instead of a human clicking on and downloading images.
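Both behaviors map onto standard GNU Wget options. The sketch below builds such a command line; the User-Agent string and the one-second base delay are illustrative choices of mine, not necessarily what image-harvester.py passes:

```python
def build_wget_cmd(url, wait_seconds=1):
    """Assemble a polite Wget invocation: spoof a browser-like
    User-Agent and space out requests with a randomized delay."""
    return [
        "wget",
        # Imitate a browser so anti-leeching checks on the server
        # are less likely to reject the request.
        "--user-agent=Mozilla/5.0 (X11; Linux i686) Gecko Firefox",
        # Base delay between successive requests, in seconds.
        "--wait=%d" % wait_seconds,
        # Randomize the delay around that base so the request
        # timing looks less mechanical.
        "--random-wait",
        url,
    ]
```

The resulting list can be handed to `subprocess.run` to perform the actual download.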

    Warning: It is not polite to download large numbers of images from websites in an automated fashion because it eats up bandwidth without the need to actually view the content of the sites. Please do not abuse my Image Harvester script by using it to download too many images at once. Whenever possible, browse the actual sites first to show courtesy and support to them, because the webmasters expect you to view their content.


I needed an explanation of images and how they affect my webpage. Well, here is what I found. Thank you, web!

No one likes waiting for a webpage to load. It’s frustrating, and almost as if the site’s host is asking you to take your attention elsewhere. Both Google and Amazon have established through testing that higher webpage load times lead to lower traffic and conversions. When Google Maps reduced the size of their homepage, traffic increased 10%. Amazon’s tests showed that for every 100 millisecond increase in load time, sales decreased by 1 percent.

Prioritizing image formatting is one of the easiest ways to speed up your site’s load time. Unfortunately, many people gloss over their image formatting and navigating their website takes much longer as a result.

Here’s an example of two images, one that has been optimized and one that hasn’t, in case you don’t believe me.

Last month, Google began accounting for the annoyance of waiting for pages to load by incorporating site speed into their search rankings, giving web hosts a tangible incentive to start paying more attention to page load times. Since image formatting contributes so much to a page’s load time, here’s a quick breakdown on how to optimize the images on your website without purchasing any photo-editing software.

How big should your image’s file size be?

An image’s file size can vary greatly, and it is the biggest determining factor in how fast an image will load on a webpage. High-tech cameras take extremely detailed pictures, resulting in huge file sizes, sometimes several megabytes each. Ideally though, an image that’s uploaded to your webpage will be under:

  • 100kb for a large image
  • 50kb for a medium image
  • 30kb for a small image
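Those rule-of-thumb budgets are easy to check programmatically before uploading. A tiny sketch (the class names and function are mine, encoding the article's thresholds):

```python
# The article's suggested per-image byte budgets, by size class.
BUDGETS_KB = {"large": 100, "medium": 50, "small": 30}

def within_budget(size_bytes, image_class):
    """True if an image's file size fits the suggested budget
    for its size class ('large', 'medium', or 'small')."""
    return size_bytes <= BUDGETS_KB[image_class] * 1024
```

In practice you would feed it `os.path.getsize(path)` for each image you plan to publish.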

Here’s a free online tool that allows you to sacrifice a little image quality in exchange for a reasonable file size and quick load times.

Web Resizer

What file extension should you use for your images?

The three file extensions you should consider for your website’s images are .jpg, .png, and .gif.

  • .jpg  – Use for images with more color like photos and images with gradients. 

    JPEG images are made up of millions of different colors but can be used at a lower quality to greatly decrease the file size. Typically JPEGs can be compressed 10:1 without much perceived loss in image quality.
     

  • .png – Use for images with less color like charts and graphs or images with large areas of the same color, especially those requiring semi-transparency. 

    PNG images were created as a free substitute for the GIF format, because the LZW compression used in GIFs was patented and required software developers to pay a license fee. PNG officially stands for Portable Network Graphics, though it is also known by the unofficial recursive backronym “PNG’s Not GIF.” PNG images are superior to GIFs in many ways including: 

    • Typically PNGs can be compressed further than GIFs
    • PNGs are capable of varying levels of transparency
    • PNGs can have a wider range of color depths

    Note for developers: here is a workaround for incorporating PNG support into IE6
     

  • .gif – Use for images with less color like charts and graphs or images with large areas of the same color. 

    GIF images are limited to 256 different colors but because image programs have the ability to control the number of colors that are in each image, charts and graphs with few color varieties can be stored at very small file sizes in the GIF format. GIF and PNG images are similar but GIFs are superior in the following ways:

    • GIFs can support animated images
    • While PNG images are widely supported, they aren’t well supported in Internet Explorer 6.0 and some HTML email clients.

To convert your image to a .jpg, .png, or .gif file extension, you can use this free online image converter tool.

Online Image Converter

What is the image title and image alt text?

The image title and alt text are used to describe the image you’re placing both to viewers and to search engines.

  • Image Title – The image title, as the name might suggest, is a short title of the image you’re placing on the webpage. It’s also what shows up in the yellow tooltip box when you mouse over an image in many browsers (mouse over the image to the right for an example).
  • Image Alt Text – The alt text for an image is alternative text to describe an image in detail for search engines. An image’s alt text will also display instead of the image for people who have images disabled in their browser. Instead of seeing an empty image icon, the viewer will get a description of the image, which could inspire that person to enable images and see what you’re trying to show them.

You can add an image title and image alt text in the HTML of a web page but many Content Management Systems, like ours, allow you to easily add them when you’re inserting the image. 
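In plain HTML, the two attributes sit directly on the img tag. The filename and text below are placeholders, not values from the article:

```html
<img src="product-photo.jpg"
     title="Red ceramic mug"
     alt="A red ceramic mug on a wooden table, photographed from above">
```

The title feeds the hover tooltip in many browsers, while the alt text is what search engines and image-disabled browsers read.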

If you’ve got any good practices for optimizing an image for the web, feel free to let us know in the comments.

How Google can ensure the speed of search results

Google incorporating site speed in search rankings

by  on APRIL 9, 2010

in GOOGLE/SEO

(I’m in the middle of traveling, but I know that a lot of people will be interested in the news that Google is incorporating site speed as one of the over 200 signals that we use in determining search rankings. I wanted to jot down some quick thoughts.)

The main thing I want to get across is: don’t panic. We mentioned site speed as early as last year, and you can watch this video from February where I pointed out that we still put much more weight on factors like relevance, topicality, reputation, value-add, etc. — all the factors that you probably think about all the time. Compared to those signals, site speed will carry much less weight.

In fact, if you read the official blog post, you’ll notice that the current implementation mentions that fewer than 1% of search queries will change as a result of incorporating site speed into our ranking. That means that even fewer search results are affected, since the average search query is returning 10 or so search results on each page. So please don’t worry that the effect of this change will be huge. In fact, I believe the official blog post mentioned that “We launched this change a few weeks back after rigorous testing.” The fact that not too many people noticed the change is another reason not to stress out disproportionately over this change.

There are lots of tools to help you identify ways to improve the speed of your site. The official blog post gives lots of links, and some of the links lead to even more tools. But just to highlight a few, Google’s webmaster console provides information very close to the information that we’re actually using in our ranking. In addition, various free-to-use tools offer things like in-depth analysis of individual pages. Google also provides an entire speed-related mini-site with tons of resources and videos about speeding up websites.

I want to pre-debunk another misconception, which is that this change will somehow help “big sites” who can afford to pay more for hosting. In my experience, small sites can often react and respond faster than large companies to changes on the web. Often even a little bit of work can make big differences for site speed. So I think the average smaller web site can really benefit from this change, because a smaller website can often implement the best practices that speed up a site more easily than a larger organization that might move slower or be hindered by bureaucracy.

Also take a step back for a minute and consider the intent of this change: a faster web is great for everyone, but especially for users. Lots of websites have demonstrated that speeding up the user experience results in more usage. So speeding up your website isn’t just something that can affect your search rankings–it’s a fantastic idea for your users.

I know this change will be popular with some people and unpopular with others. Let me reiterate a point to the search engine optimizers (SEOs) out there: SEO is a field that changes over time, and the most successful SEOs embrace change and turn it into an opportunity. SEOs in 1999 didn’t think about social media, but there’s clearly a lot of interesting things going on in that space in 2010. I would love if SEOs dive into improving website speed, because (unlike a few facets of SEO) decreasing the latency of a website is something that is easily measurable and controllable. A #1 ranking might not always be achievable, but most websites can be made noticeably faster, which can improve ROI and conversion rates. In that sense, this change represents an opportunity for SEOs and developers who can help other websites improve their speediness.

I know that there will be a lot of discussion about this change, and some people won’t like it. But I’m glad that Google is making this step, both for the sake of transparency (letting webmasters know more about how to do better in Google) and because I think this change will make the web better. My takeaway messages would be three-fold: first, this is actually a relatively small-impact change, so you don’t need to panic. Second, speeding up your website is a great thing to do in general. Visitors to your site will be happier (and might convert more or use your site more), and a faster web will be better for all. Third, this change highlights that there are very constructive things that can directly improve your website’s user experience. Instead of wasting time on keyword meta tags, you can focus on some very easy, straightforward, small steps that can really improve how users perceive your site.

 


responsive web

I have been reading a series of web developer books, and the topics are quite interesting. Much of the push to make the web more responsive comes from the range of devices now viewing it: smartphones, e-readers, and iPads have all been challenging developers. I share the concern; I have been searching the internet on a cell phone and it is a total pain, and these new ways of handling how we view pages have made the experience much more inviting. Good stuff. It is also just one more area where a business has to concern itself with how it is viewed on the internet.
 

A rebuttal to the cons

Needless to say, having a web site to promote your business or ideas is imperative nowadays. Nearly all households have access to the internet, sometimes even on more than one computer. The great thing about the internet is that it can reach the whole world, and not just your neighbourhood. Thus, if you have anything to sell or anything to say, everyone can get to know about it.

Your web site is your shop window and it is open 24/7. It needs to be eye catching and transmit your message in a few seconds. It needs to have enough content that explains your message and what you do. It needs to load fast, be reliable and look professional. The impression your web site should give is that of a company or professional who knows what they want to sell, or the message they want to send.

Having a web site is not just a commodity for businesses nowadays, but a must. A company or a professional without a web site is like a salesman without his business card. The web site needs to be your profile and what’s good about it is that there is limitless information you can put on it. Another good thing is that static content on your site can be changed very easily. Even better, you can opt to go for a dynamic site which will automatically change your content based on changes to your business, like stock items, prices, articles, services and more.

Promoting your web site is the next step after having designed and developed one. The best way to do this on the web is by using the search engines. After all, the people you want to visit your site have no way to find you except through the search engines, unless they already know of your site. This is normally done through a search engine optimization process directly on your site. Other forms of promoting a site are: paid advertising with the search engines, and inbound links from other related or non-related sites.

Your web site is your identity, make sure you have one!

Sandro Azzopardi has over 20 years experience in IT. He has worked on various projects with small to large organizations. Together with a team of other professionals, he is specializing in Web Packages, offering a complete web solution on [http://www.cbswebpack.com/packages.html]

Article Source: http://EzineArticles.com/?expert=Sandro_Azzopardi

Article Source: http://EzineArticles.com/232366

Constricting business?

This topic is wide open. 

Does web constrict business?

I understand all of the pros. One can reach an audience that they are not normally reaching. An e-commerce site can sell items 24/7.

What negatives does the web offer?

Competitors seem to rise out of nowhere. Businesses can also compete with themselves if they have a webpage (depending on the type of business).

Some services are becoming obsolete: travel agents, stock brokers, the post office.

Security issues arise; with hackers and digital terrorism, there is an added expenditure.

Loss of productivity in the workplace. 

Consider the source of what you are reading.

I figured I would give the cons considering all the pros we are all aware of and enjoy.