The Computer Whiz – Web Design, Performance and Computer Repair
http://nnucomputerwhiz.com

Setup VPS with Debian, Nginx and Virtualmin
Wed, 14 Mar 2012 | http://nnucomputerwhiz.com/setup-vps-with-debian-nginx-and-virtualmin.html

I recently upgraded my VPS to use Nginx and Virtualmin and found it to be a killer setup. Nginx is super fast in a low-memory environment, and the Virtualmin GPL edition is a powerful, free control panel that I have found to be more intuitive than cPanel. This setup works great with a reasonably priced ($5.95) 512 MB VPS from Burst.net. This guide walks through what is required to set up a VPS with Debian, Nginx and Virtualmin.

Purchase and Install VPS

The signup process for Burst.net is basically the same as any other hosting company. Plans range from 512 MB RAM, 1 GHz CPU, 20 GB of disk space, and 1 TB of bandwidth for $5.95/month up to 4 GB RAM, 4 GHz CPU, 250 GB of disk space, and 4 TB of bandwidth for $49.95/month. For this example I am using the 512 MB option, and I would highly recommend it unless you know your traffic load will require much more or you are planning on running more memory-intensive apps such as Ruby on Rails or JSP. Upgrading the VPS later is very easy, so you can start with the minimum and increase the capacity whenever you need it. The OS I chose and highly recommend is Debian 6, 32-bit. The setup process is basically the same for CentOS and Ubuntu, but you'll be on your own if you try a different OS. Why not 64-bit? A 64-bit OS requires roughly twice as much memory, so to keep our costs at a minimum a 32-bit OS will actually work better. For a web server the extra bits don't provide much benefit; the only real reason to use a 64-bit OS is to address more than 4 GB of memory, which we cannot do in this VPS anyway.

Login to VPS and update everything

After you purchase the VPS, burst.net and most other VPS hosts will send a welcome letter with your VPS’s IP and root password like so:
VPS IP Address: 184.22.251.243
SSH Root Password: v49o2k0vAQ

Then you can connect to your server with the command "ssh root@184.22.251.243" if you're using Linux or OS X. If you are using Windows, I would recommend PuTTY as an SSH client. Once you are logged in, I recommend running "passwd" to change your password to something more memorable.

It is recommended that you update all packages before you install Virtualmin. Run the commands:
apt-get update
and
apt-get dist-upgrade

Enable Dotdeb packages

Although not required, I highly recommend enabling the Dotdeb repository, which contains the most recent versions of PHP (5.4) and Nginx (1.0.13).

Add the repository to sources.list with the command:
echo "deb http://packages.dotdeb.org squeeze all" >> /etc/apt/sources.list
Add the appropriate GnuPG key with:
wget http://www.dotdeb.org/dotdeb.gpg -O - | apt-key add -
Then update
apt-get update

Install Virtualmin GPL

The default install includes a few unnecessary packages, most notably ClamAV, which is used for virus scanning of email and consumes a huge amount of CPU and memory. Since this is a low-memory system, the installation will run out of memory if ClamAV is installed, so we need to modify the installation script so it skips ClamAV. Either remove the clamav packages from the debdeps= line in install.sh, or download a modified install.sh. You can do it directly on the server with the command:

wget http://nnucomputerwhiz.com/wp-content/uploads/install.sh
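
If you would rather edit the stock installer yourself, a sed one-liner along these lines should work. This is only a sketch: it assumes the ClamAV packages are listed on the debdeps= line and that their names all start with "clamav", so check the result before running the installer.

sed -i.bak 's/clamav[a-z-]*//g' install.sh   # strip any package whose name starts with "clamav"; keeps a .bak copy
grep debdeps= install.sh                     # verify the line no longer mentions clamav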

Then start the install script with:

/bin/sh install.sh

Press 'y' to continue the installation. If the install script detects that you don't have a fully qualified domain name it will prompt you to enter one, for example server.example.com. It doesn't matter if the domain is currently registered and pointing to your IP address, but you should pick one that you could set up properly when needed. The install script may also ask for the name of your primary network interface; in my case this was 'venet0:0', but you can find it by running 'ifconfig' and looking for the device with your primary IP assigned to it. In most cases the installer will figure it out automatically.

Once that is complete, the installer will begin downloading all the packages required for a working system from your distribution's sources. It will also add its own repository to your /etc/apt/sources.list file, which is used for the installation and updating of Virtualmin itself. If all goes well, Virtualmin will install successfully and you can log in to it at https://ipaddress:10000. It will run a post-installation wizard which asks a few configuration questions. My recommendations:

  1. Preload Virtualmin libraries? Choose 'No'.
  2. Run email domain lookup server? Choose 'No'.
  3. Virus scanning: it should say it does not know how to start clamd, which is a good thing because we don't want it.
  4. Run SpamAssassin server filter? Choose 'No'.
  5. Run MySQL server? Choose 'Yes', set the MySQL password, and pick the "small system" configuration size.
  6. Run PostgreSQL server? Choose 'No'.
  7. Primary nameserver: in most cases this will be the server itself.
  8. Password storage mode: I highly recommend "Only store hashed passwords".

After the post-installation wizard finishes you’ll want to go to System Settings -> Features and Plugins and uncheck the “Virus Filtering” option or Virtualmin will complain about not having clamav installed.

Install Nginx and Nginx Virtualmin plugin

These steps are taken from the official documentation.
Shutdown and disable apache: /etc/init.d/apache2 stop ; update-rc.d apache2 remove
Install Nginx: apt-get install nginx
Start Nginx: /etc/init.d/nginx start
Install Virtualmin Nginx Plugin: apt-get install webmin-virtualmin-nginx webmin-virtualmin-nginx-ssl

  1. Login to virtualmin as root (https://ipaddress:10000)
  2. Go to System Settings -> Features and Plugins
  3. Un-check the “Apache website” , “SSL website” and “DAV Login”, “Mailman”, “Protected web directories”, “AWstats reporting” and “Subversion repositories” features.
  4. Check the “Nginx website”, then click “Save”.
  5. Due to a small bug you cannot select "Nginx Website" and "Nginx SSL Website" at the same time, so go back and select "Nginx SSL Website" after enabling "Nginx Website".
  6. Go to the System Information page and click Refresh system information in the top right.
  7. Verify that running appears next to Nginx in the Status section.
  8. It's also a good idea to click System Settings -> Re-Check Configuration after everything is done to ensure it's all working properly.

Now you can add Nginx sites using the Create Virtual Server link just as you would with Apache. The only difference is that "Enable Nginx" needs to be checked in the features section.

Conclusion

I have found this setup to be extremely fast and flexible. Virtualmin automatically configures Nginx with FastCGI to run PHP as the domain owner's user. Each virtual server also has its own php.ini, tmp, and logs folder, which makes managing separate users and sites a breeze. Virtualmin also takes care of managing email accounts, FTP accounts, email aliases, MySQL databases, DNS records and more. So for $5.95 you have complete control of your hosting environment with an intuitive interface for creating and administering an unlimited number of domains, all for the price of a shared server. No extra license fees are needed for cPanel, and no artificial limits are imposed on the number of databases, domains, or email accounts.
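
For reference, the per-domain server block this setup ends up with looks roughly like the sketch below. The domain, paths, and socket name are invented for illustration, so check the config Virtualmin actually generates on your own server (for example under /etc/nginx) rather than copying this verbatim.

server {
    listen 80;
    server_name example.com www.example.com;             # hypothetical domain
    root /home/example/public_html;
    index index.php index.html;

    location ~ \.php$ {
        try_files $uri =404;
        # PHP runs over FastCGI as the domain owner's user via a per-domain socket
        fastcgi_pass unix:/var/php-nginx/example.sock;    # socket path is illustrative
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }

    access_log /home/example/logs/access_log;
    error_log  /home/example/logs/error_log;
}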

 

How to style input tag file uploads in Webkit (Chrome, Safari)
Sun, 04 Mar 2012 | http://nnucomputerwhiz.com/how-to-style-input-tag-file-uploads-in-webkit-chrome-safari.html

If you've ever been tasked with creating a consistent user experience across all web browsers, you've probably had a real challenge styling form input elements. Historically IE has been the most challenging to style, but with file input elements it is WebKit that is almost impossible to change from the default "Choose File" control. Typically the file input is styled by setting its opacity to 0 and absolutely positioning it over an image of a browse button with whatever style is desired. The downside of this is that it requires JavaScript to update a text input element after a file is chosen with the file selection dialog. The technique below uses pure CSS, with a few vendor prefixes, to present a consistent interface in all major browsers.

Basic unstyled file input:
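
The live demo element does not survive in this copy, but the markup being styled is just a plain file input (the name attribute here is only for illustration):

<input type="file" name="upload" />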

Screenshots of popular browsers:
Soon to be added.

As you can see, the file upload buttons all look about the same with the exception of the WebKit-based browsers, Chrome and Safari, which show a "Choose File" button with the name of the chosen file to the right of it. This technique will change not only the content of the button but also its position relative to the file-name text.

input[type=file] {
-webkit-appearance: textfield;
position: relative;
-webkit-box-sizing: border-box;
}
input[type=file]::-webkit-file-upload-button {
width: 0;
padding: 0;
margin: 0;
-webkit-appearance: none;
border: none;
}
/* "x::-webkit-file-upload-button" forces the rules to only apply to browsers that support this pseudo-element */
x::-webkit-file-upload-button, input[type=file]:after {
content: 'Browse...';
display: inline-block;
left: 100%;
margin-left:3px;
position: relative;
-webkit-appearance: button;
padding: 3px 8px 2px;
}

Now our input field looks like:

Cool huh?
So how does this work?
The secret is "input[type=file]::-webkit-file-upload-button", which lets us select the "Choose File" button itself. We hide that button and then add our own with "input[type=file]:after". The rule is preceded by "x::-webkit-file-upload-button, ", which causes browsers that don't support the -webkit-file-upload-button selector to ignore the entire rule. Without it, other browsers might render the :after content, which could mean two browse buttons being displayed. "-webkit-appearance" is a useful property for giving any element a native look; here we use it to make the :after content look like a button and the input element itself look like a text field. Now WebKit's styling matches IE and Firefox, with the browse button located after the box showing the selected file.

How to get cash for old computers
Mon, 05 Sep 2011 | http://nnucomputerwhiz.com/how-to-get-cash-for-old-computers.html

If you're like me you probably have a small or not-so-small collection of computer stuff, most of which is obsolete, broken or both. Well, now it's possible to clean out your closet and get some cash for it as well. If you help your friends and neighbors get rid of their old computer junk, it can add up to a significant amount of cash thanks to Cash for electronic scrap.

I had a garage full of old and broken computers, scanners, printers and network hardware that needed cleaning out, and I wanted to see what I could get for it. The total count was about 15 towers, 6 laptops, 2 scanners, 1 largish copier, 2 inkjet printers, and a 24-port 10BASE-T network hub. I discovered Cash for electronic scrap, which lets you send in electronic scrap via FedEx; they extract the gold, silver and other metals and then send you a check based on what the scrap was worth. The process is entirely risk free since they pay for the shipping with a pre-paid shipping label that can be printed from their site.

To prepare my shipment, I removed every component from the computers and other devices. They only take the electronic circuit boards and processors, so to maximize your profit it's best to remove anything that doesn't have value and keep the shipment as light as possible; the cost of shipping is presumably deducted from the value of the recycled materials, so the less excess material shipped the better. I spent an afternoon removing all the motherboards and add-in cards from each computer and putting them in a box. Cash for electronic scrap requests that the components be separated by type, so the CPUs went in one box and the rest of the printed circuit boards in another. I stripped out the circuit boards from the hard drives, CD-ROMs and floppy drives, and also disassembled and removed the circuit boards from the printers and other electronics. The entire process took several hours.

After all was done I had 42 pounds of electronic scrap that I was able to fit into one box. Printing a FedEx label from their site was straightforward. I taped the label to the box and dropped it off at a local FedEx shipping location. About 2 weeks later I received a check for $68.25, much more than I was expecting for electronic scrap. As proof, here's a scan of the check:

It's impossible to know exactly how much electronic scrap is worth until the actual work is done to extract the precious metals. The towers I sent in were from the Pentium I and II era, the laptops were mostly Pentium 4s, and I doubt the scanners and printers really added any value. As a general rule of thumb, the older the computer the more it's worth, and CPUs are probably the most valuable component.

So if you've got a lot of computer junk around, I would highly recommend Cash for electronic scrap. Preparing the shipment does take some work, but so would hauling the stuff off to the dump, so it's a great option if you actually want to be rewarded for cleaning out the garage. For one or two computers it's probably not worth the effort. It's also possible to get old computers for free by checking the free section on Craigslist or watching the postings on Freecycle. I found a few useful computers this way, as well as tons of junk computers that I later recycled for cash.

If you use Cash for electronic scrap please leave a comment and let people know what you sent in and how much it was worth.

Cheap VPS ~ 512 MB for $5.36/month
Sat, 23 Jul 2011 | http://nnucomputerwhiz.com/cheap-vps-for-5-95month.html

If you need root access but don't want to pay any more than a shared hosting account price, then check out burst.net. I have found them to be reliable and fast, with reasonable support.
UPDATE: Now until the end of August, get 10% off any Linux or Windows VPS in any location. Use promotion code "summer10". Check their Twitter feed for more promotions. So instead of $5.95/month you can get a 512 MB VPS for $5.36/month, and this rate is not limited to the first 3 or 6 months; it applies for as long as you keep the product.

I was in the market for a VPS and found one that was only a few dollars more than I was paying for my shared hosting account, with full root access and all the other frills of owning a VPS. I needed root access to set up the Nethack game portal I was making, and I needed enough RAM to run Ruby on Rails. There are lots of companies offering a VPS for under $10 these days, but burst.net had good reviews with no complaints except from spammers complaining about getting suspended. I started out with their cheapest option of 512 MB RAM and 20 GB of space for $5.95/month and later upgraded to 1.5 GB of RAM and 100 GB of space for $19.95/month.

Performance is obviously not as good as a dedicated machine would be, but it is just as good as any shared hosting account. I ran phpspeed, a nice quick MySQL/PHP server benchmark, on the VPS right after it was set up. The score I got was about 25% higher than when I ran the same test on my existing shared hosting provider. I've clocked the bandwidth I'm currently getting at 7 MB/s with a simple test of downloading a large file from burst.net to a dedicated host I have access to.

Uptime has been pretty good. Over the past year and a half there have been a few power failures resulting in about 8 hours of downtime, and a couple of times I've been migrated to a new node with just a few minutes of downtime, so it's very good for a host at this price. Support requests have been handled in a timely manner, usually within a few hours. They were very knowledgeable too, especially with unusual problems, like when I was using too much disk I/O running the Squid cache proxy software.

Their other products are very reasonably priced as well, with dedicated hosting starting at less than $50. So unless you need absolute 100% uptime and support that will hold your hand and set everything up for you, I highly recommend burst.net.

IE9 image load event bug
Mon, 06 Jun 2011 | http://nnucomputerwhiz.com/ie9-image-load-event-bug.html

I just found a problem with how IE9 handles the load event for img tags. With IE8 and earlier, if I wanted to perform some JS function after an image loaded I could do something like this:

img = new Image();
img.src = 'some.jpeg';
if (img.complete) {
  somefunc();
} else {
  img.attachEvent('onload', somefunc);
}

but with IE9 the event never fires, regardless of whether the image file is in the browser cache or not. To fix this you just need to set the img src after attaching the event, or re-assign the img src after attaching the event, like so:

img = new Image();
img.src = 'some.jpeg';
if (img.complete) {
  somefunc();
} else {
  img.attachEvent('onload', somefunc);
  img.src = img.src;
}
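
attachEvent is IE-only, so if the same code path also has to run in other browsers, a feature-tested variant along these lines should work (somefunc and the image URL are placeholders, as above):

var img = new Image();
img.src = 'some.jpeg';
if (img.complete) {
  somefunc();                                    // already cached and decoded
} else if (img.addEventListener) {
  img.addEventListener('load', somefunc, false); // standards browsers
} else {
  img.attachEvent('onload', somefunc);           // older IE
  img.src = img.src;                             // re-assign src so IE9 fires the event
}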

Hopefully this will help someone else who runs into the same problem, and encourage the IE team to fix it.

Use WebP images with JPEG fallback
Thu, 05 May 2011 | http://nnucomputerwhiz.com/use-webp-images-with-jpeg-fallback.html

Behold the Power of WebP

If the image below has "webp" in the corner, your browser can display WebP; if it has "jpeg", your browser cannot show WebP images, but the user experience remains the same. WebP is a new image format pioneered by Google, based on the VP8 video codec, intended to reduce the download size of images on the web. What I am presenting is a way to offer WebP images to browsers that support them while falling back to JPEGs, so that all visitors get the same user experience.

What happened to  your hand?
This page uses JavaScript to test whether your browser supports WebP and loads the appropriate format. Native browser support is limited to Chrome 9+ and Opera 11.10, but thanks to @antimatter15 and his work, WebP images can be displayed in any browser that supports WebM: his script creates a single-frame WebM video and places it in a <video> tag to replace the <img> tag. I have modified his original code to add support for JPEG fallback when WebP isn't supported. This allows WebP to be used right now without showing broken images to browsers that don't support it.
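
The actual detection in weppy.js may be done differently, but the canvas-based check below is one common way to test for WebP support in the browser; treat it as a sketch.

function supportsWebP() {
  var canvas = document.createElement('canvas');
  if (!canvas.getContext || !canvas.getContext('2d')) return false;
  // Browsers with WebP support can encode the canvas as image/webp;
  // everything else silently falls back to a PNG data URL.
  return canvas.toDataURL('image/webp').indexOf('data:image/webp') === 0;
}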

Get the Javascript Code for WebP support here.

Baby Jesus and Mary

Using it

Add <script src="weppy.js"></script> anywhere on the page, but for the best performance add it at the end of your document right before the </body> tag. Then create your img tags like so:
<img alt="Baby Jesus and Mary" class="webp" data-src="baby-jesus-and-mary.jpg" />
data-* attributes are a new HTML5 feature and a completely valid way to provide arbitrary data accessible from JavaScript. At first I tried setting the "src" to the JPEG and switching it to the WebP image after the JavaScript test verified support for WebP. However, doing this causes both image formats to be downloaded, which makes the page load much slower. To prevent the JPEG from being loaded until needed, the "src" must be empty and "data-src" is used instead.
I am willing

Browser support and issues

Native WebP support is currently available in Chrome 9+ and Opera 11.10. Support for WebP via a WebM video tag is available in Chrome 7.0+, Opera 10.62+, Firefox 4.0 and even IE9 with the WebM MF Components, or IE6+ with Chrome Frame. I would estimate about 25% of internet users will be able to take advantage of WebP with weppy.js, and in the next few months that will probably increase dramatically as more people adopt Chrome or upgrade to Firefox 4. The major downside is that images will not work when JavaScript is turned off, and web bots that don't run JavaScript won't know about the image unless the "src" is valid. Since we are using the data-src attribute and dynamically changing it with JavaScript, it's difficult for any web spider to actually find the images. Depending on whether you want your website to show up in online image searches, this could be a real downside to using WebP images. One workaround would be to duplicate each image in <noscript> tags like so:


<img data-src="image.jpg" class="webp" alt="an image" width="100" height="100" />
<noscript>
<img src="image.jpg" alt="an image" width="100" height="100" />
</noscript>

The major disadvantage of doing this is that it clutters up the HTML, but it will ensure that everyone and everything can access your images.

Limitations of WebP images as WebM video

Since the script creates a video tag from an img tag, the element behaves somewhat differently. Naturally any CSS applied to img tags will have to be modified to apply to video tags as well. With an img tag the browser scales the image to fit whatever width and height are set, but with a video tag the content is scaled while keeping the aspect ratio of the image's native size; it won't be stretched to fit. Since you should be setting the width and height to the image's natural width and height anyway, this shouldn't be an issue.

XMLHttpRequest is used to load the WebP image and create the WebM video, and there may be a bit of a performance penalty because the image has to be loaded as binary data, which adds a little overhead. Also, the WebP images have to be downloaded from the same domain unless Cross-Origin Resource Sharing is used, which basically means that the WebP file has to be served with the HTTP header:
Access-Control-Allow-Origin: *
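
If Apache is serving the WebP files from that other host, something like the following in its .htaccess should add the header. This assumes mod_headers is enabled, and matching on the .webp extension is just one way to scope it.

<FilesMatch "\.webp$">
Header set Access-Control-Allow-Origin "*"
</FilesMatch>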

WebP in CSS

There is currently no way to use WebP images in CSS backgrounds without native support, and I haven't found a good method of falling back to JPEGs with pure CSS. weppy.js does add a "webp" class to the html element, similar to how Modernizr works, but this happens too late in the document loading process to prevent loading any JPEG images specified in the CSS. The only way I've seen to use WebP images in CSS is with server-side browser detection to serve a special CSS file to browsers that support WebP. A similar method could also be used to create the src attributes for img tags, eliminating the need for JavaScript. The downside to these methods is that they are based on the User-Agent header, which is not always predictable, and the User-Agent detection rules would have to be updated whenever new browsers add support for WebP.

Install X11 on OS X 10.4 Tiger Without the Install DVD
Fri, 22 Apr 2011 | http://nnucomputerwhiz.com/install-x11-on-os-x-10-4-tiger-without-the-install-dvd.html

I'm always surprised how hard it is to install standard open source packages on OS X, and X11 is no different. Installing it on 10.4 is virtually impossible without the original installation DVD, which I never have handy. I got this from James Martin but thought it needed an extra mirror because downloading it from there was painfully slow. Here it is for your convenience.

    Download X11 for OSX Tiger Here

    Enjoy

Eshop Authorize.net AIM Integration
Mon, 18 Apr 2011 | http://nnucomputerwhiz.com/eshop-authorize-net-aim-integration.html

Eshop is a nice ecommerce plugin for WordPress, but it does not support taking credit card information directly on the site; instead it redirects the customer to the Authorize.net site, PayPal, or some other payment processor. I needed to accept cards without redirecting the customer to an external site, so I wrote a patch for Eshop. Download the patch here.

It is a standard patch file, so you'll need to use the Unix patch command to apply it to an existing eshop folder. It was created against eshop 6.2.2 but might work on other versions with some tweaking. Download eshop 6.2.2 here.

To apply the patch, run the command:

    patch -p1 < authaim.patch

Run it from within the eshop directory; the patch file will need to be located in that directory as well.
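
If it helps, the whole sequence looks something like this. The paths are assumptions, so adjust them to wherever your WordPress plugins and downloaded patch actually live.

cd /path/to/wordpress/wp-content/plugins/eshop   # the existing eshop 6.2.2 install
cp ~/Downloads/authaim.patch .                   # the patch file downloaded from the link above
patch -p1 < authaim.patch                        # apply it in place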

If you don't have Unix or can't use patch, here's a pre-patched version:
    Prepatched eshop.6.2.2-aim

    I make no promises that it will actually work properly so test it before installing it on a live site.

    Leave comments if you have any problems and I'll see what I can do.

I'll try to get a patched version for 6.2.8 out soon.

Superfast embedded fonts with @font-face
Sun, 06 Mar 2011 | http://nnucomputerwhiz.com/superfast-embedded-fonts-with-font-face.html

Now that font embedding is supported in all major browsers, it has become increasingly common thanks to the bulletproof embedding technique. However, embedded fonts can add several hundred milliseconds, if not whole seconds, to a page load, and can even block rendering if not done correctly. Here you will find how to mitigate the performance implications, if not eliminate them completely.

    Without optimization

Font Squirrel is the place to go when you need embeddable fonts. They have a large selection of free fonts and a good font converter. For this test we will be using Josefin Slab. If you download the @font-face kit it will give you all 10 styles in 4 different formats, along with the CSS needed to make them work in all browsers. Oh for the day when we will only need to worry about one format; alas, as it is there are four, but for simplicity's sake, and to target the worst offender, we'll focus on IE and its proprietary EOT format. Demo page here.
    Test results for default demo page
    Load time – 2.405s
    Obviously there’s room for improvement.

    Remove unneeded files

10 styles is nice but way overkill. Let's condense it to 4: regular, bold, italic and bold-italic; that ought to keep all our designers happy. We could probably get rid of bold-italic, but we'll keep it just to add a challenge. It's important to note that IE downloads every font specified in the style sheet regardless of whether it's actually used on the page, so it will be downloading our four fonts even if there is no bolded and italicized text. New demo. Notice in the CSS that I changed the font-family of each @font-face rule to be the same and changed the font-weight and font-style properties to match the font file, as illustrated below. This allows us to use <em> or <b> tags as we normally would without having to specify a different font-family.
    Test Results: 1.340s Load time
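
For reference, the condensed rules end up looking roughly like this. The file names follow Font Squirrel's kit naming and are assumptions; only two of the four faces are shown.

@font-face {
  font-family: 'JosefinSlab';
  src: url('JosefinSlab-Regular-webfont.eot');
  src: url('JosefinSlab-Regular-webfont.eot?#iefix') format('embedded-opentype'),
       url('JosefinSlab-Regular-webfont.woff') format('woff'),
       url('JosefinSlab-Regular-webfont.ttf') format('truetype'),
       url('JosefinSlab-Regular-webfont.svg#JosefinSlab') format('svg');
  font-weight: normal;
  font-style: normal;
}
@font-face {
  /* same family name, so <b> and <em> pick up the bold file automatically */
  font-family: 'JosefinSlab';
  src: url('JosefinSlab-Bold-webfont.eot');
  src: url('JosefinSlab-Bold-webfont.eot?#iefix') format('embedded-opentype'),
       url('JosefinSlab-Bold-webfont.woff') format('woff'),
       url('JosefinSlab-Bold-webfont.ttf') format('truetype'),
       url('JosefinSlab-Bold-webfont.svg#JosefinSlab') format('svg');
  font-weight: bold;
  font-style: normal;
}
/* italic and bold-italic follow the same pattern */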

    Optimize webserver delivery

Removing all the extra styles helped a lot, but that was the easy part. If you look at webpagetest's optimization checklist, it suggests doing two things for how the fonts are delivered: enable gzip compression and enable browser caching. The exact process for enabling gzip compression depends on which web server you are using. For the widely popular Apache it can be done in the .htaccess file. Simply add these lines:


    AddType font/ttf .ttf
    AddType font/otf .otf
    AddType application/x-woff .woff
    AddType image/svg+xml .svg
    AddType application/vnd.ms-fontobject .eot

    <IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType font/ttf "access plus 10 years"
    ExpiresByType font/otf "access plus 10 years"
    ExpiresByType application/vnd.ms-fontobject "access plus 10 years"
    ExpiresByType application/x-woff "access plus 10 years"
    ExpiresByType image/svg+xml "access plus 10 years"

    </IfModule>

    <IfModule mod_headers.c>
    <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf|eot|woff|otf|ttf|svg)$">
    Header set Cache-Control "max-age=290304000, public"
    </FilesMatch>
    </IfModule>

    AddOutputFilterByType DEFLATE application/vnd.ms-fontobject
    AddOutputFilterByType DEFLATE font/ttf
    AddOutputFilterByType DEFLATE font/otf
    AddOutputFilterByType DEFLATE image/svg+xml

First we add the type for each font so Apache knows what to call things. Next we use either mod_expires or mod_headers to set the cache control so the browser does not need to re-download the font files. Lastly we add the DEFLATE filter to each font type so that they are gzip compressed. Note: we don't add DEFLATE to the WOFF type because WOFF is natively compressed.

It is actually possible to natively compress EOT fonts using EOTFAST. It's Windows-only software so I had to run it in a VM, but it worked; I had problems getting the GUI to work, so I just used the command line. It lives up to its claims of reducing font sizes by up to 70%, and the EOT files it produces are about 5-10% smaller than gzipped versions of the Font Squirrel-generated EOT files. So it's a good idea to use it, but not strictly necessary.

    Our Speed Test without eotfast now shows a load time of 0.9 s and there is still room for improvement. Notice that the repeat view load time is much lower now since we implemented caching.

Just for fun I ran the speed test with the EOTFAST files, which shows a 0.890s load time. Wow, it loaded a whole hundredth of a second faster! (sarcasm) So if you really want to be as fast as possible use EOTFAST, but you probably won't notice the difference.

Embedding the font files using MHTML

Base64-encoded data URLs have long been used to save HTTP requests for small background images and the like. They were not supported by IE until version 8, and IE8 limits them to 32 KB, which is actually too small for some fonts. Fortunately IE6+ supports a similar technology called MHTML which can be used to embed the fonts directly into our CSS file and save HTTP requests. I decided to try it out to see if pages loaded any faster. It turns out they don't.

    For details on how to use mhtml see the proper mhtml syntax.

    Demo page with mhtml embedded fonts.

    The tests are below:
MHTML embedded font files with EOTFAST: load time 1.076s.
MHTML embedded font files without EOTFAST: load time 1.044s.

Besides the load time being longer, this approach is very poor because the start render time becomes 1.148s: the browser doesn't render the page until the large CSS file is downloaded. I thought I could get around this by MHTML-embedding the font files into a separate file; IE supports any file as long as it is sent with a text content type. That demo page's speed test has a better load time at 0.954s, but it's still slower than serving plain separate EOT files. I believe this is because of several factors. First, base64 encoding adds about 30% to the total file size, which is mostly mitigated by using gzip compression. It also seems IE has to work harder to decode the MHTML file, because the repeat-view load time is longer even though nothing has to be re-downloaded. The thing that really helps separate EOT files load faster is that IE can download them in parallel. So the verdict is: don't use MHTML for fonts, or for anything that is more than a few kilobytes.

    Fonts and Start to Render Time

For the most part fonts do not affect your time to start rendering, which is very good. One can safely embed fonts using @font-face, which only adds a little to the total page load time. What most browsers do is render the entire page except the text that depends on an external font file; that text stays hidden until the file is downloaded. The exception is Firefox, which shows a default font until the font file is loaded, after which the fonts noticeably change. This behavior can be changed with the WebFont Loader, a JavaScript library from Google. There is one small problem with Internet Explorer: if there are any <script> tags before the <style> tag with the @font-face rules, it will block rendering of the entire page until the font file is downloaded. So it is always good practice to include your <script> tags just before the </head> tag or, even better, at the bottom of the page.
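
For completeness, basic WebFont Loader usage looks roughly like this; the family name and stylesheet URL are placeholders, and the loader's own documentation covers the details.

<script src="http://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js"></script>
<script>
  // The loader adds loading/active classes to <html>, which lets you control
  // how text looks while the font files are still downloading.
  WebFont.load({
    custom: {
      families: ['JosefinSlab'],
      urls: ['/fonts/josefin-slab.css']  // placeholder stylesheet containing the @font-face rules
    }
  });
</script>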

    Font delivery networks

Gzip compression and Expires headers go a long way toward decreasing font load time, but to load the fonts as fast as possible they need to be served from a very fast server. It also helps if the server is geographically close to the site visitor. This is not much different from a content delivery network, just specific to font files and the CSS to support them. There are several commercial font delivery networks: TypeFront lets you upload any font, which it will then serve from their network for your site; Typekit has a large selection of fonts which can be used from their network for the price of a subscription; and Google offers its Web Fonts API, which serves a limited number of fonts from Google's servers. Using the Google service is very easy; just add a line to load the CSS file in the <head> section:

<link href='http://fonts.googleapis.com/css?family=Josefin+Slab:regular,regularitalic,bold,bolditalic' rel='stylesheet' type='text/css'>

This loads the Josefin Slab font in regular, bold, italic and bold-italic. Running a speed test shows that using Google the median load time is 1.250s. What, more than 350 ms longer? Well, if we look closely at the tests we see that my web server varies greatly in how long it takes to serve pages. But if we just look at the time spent serving the fonts and CSS, my server took 627 ms while Google's servers took 798 ms; 275 ms of Google's time was spent doing DNS lookups, so once you subtract that, Google's servers were actually 104 ms faster. Of course this is a test from the same country in which my server is located; from other countries the difference would be much greater. So using the Google Font API may or may not be faster than serving the files from your own server. I noticed that Google could save a DNS lookup if they served their fonts from the same domain as their CSS file. They are also employing a User-Agent test when serving the CSS file, which saves a few bytes because they only send the relevant @font-face rules, but User-Agent strings aren't always a reliable means of testing for browser capabilities. In fact Google is cheating in this case and only serves one font, instead of the four variants, to IE. IE still shows bold and italic text, but it doesn't look the same as the real font files; I guess Google thinks IE just can't handle the extra files. It sends all four to Chrome, although they are sent as TTF instead of WOFF. In my testing WOFFs are usually smaller, so it would make more sense to send WOFF files to the browsers that support them.

    Conclusion

For my own purposes I found that using embedded fonts with the above .htaccess changes works very well, and the performance of the page is not affected much. I much prefer to use my own server to deliver fonts, where I have complete control of which fonts are served, and to avoid extra DNS lookups. MHTML and inline data URI fonts do not increase performance. Also, 4 different font styles are probably not needed; 1 would actually be enough for most purposes, but I chose to use 4 in these examples so that the load time of the fonts would be more prominent. Please let me know how you embed fonts in the comments below.

Bulletproof and fast CSS gradients
Wed, 26 Jan 2011 | http://nnucomputerwhiz.com/bulletproof-and-fast-css-gradients.html

I am daring to claim that I have found the best bulletproof way to create CSS3 background gradients. It is the best because it guarantees that extra HTTP requests are not made unless needed, and bulletproof because it works in every browser, falling back to a normal background-image gradient when necessary.

    UPDATE: Added new rules for IE10 and Opera 11.10!

    For the impatient here is the code:

    background: #FFF url('grad.png') repeat-x left top; /* fallback to image */
background-image: none,url('grad.svgz'); /* gzipped svg for Opera < 11.10 and IE9, base64 encoding desirable */
    background-image:none\9; /* IE hack to prevent ie from needlessly downloading the fallback image. It will use the filter instead*/
    background-size: 100% 100%; /*scale background image if possible */
    background: -webkit-gradient(
    linear,
    left bottom,
    left top,
    color-stop(0, #FFF),
    color-stop(1, #000)
    ); /* For older Safari / Chrome */
    background: -moz-linear-gradient(
    bottom,
    #FFF 0%,
    #000 100%
    ); /* Firefox 3.6+ */
    background: -webkit-linear-gradient(
    bottom,
    #FFF 0%,
    #000 100%
    ); /* Recent Chrome and Safari */
    background: -o-linear-gradient(
    bottom,
    #FFF 0%,
    #000 100%
    ); /* Opera 11.10+ */
    background: -ms-linear-gradient(
    bottom,
    #FFF 0%,
    #000 100%
    ); /* IE10 */
    background: linear-gradient(
    bottom,
    #FFF 0%,
    #000 100%
    ); /* Current standard if anyone ever supports it */
    filter: progid:DXImageTransform.Microsoft.gradient(endColorstr='#ffffff', startColorstr='#000000'); /* for IE */

.gte9 tag {
/* 'tag' stands for whatever selector gets the gradient; the .gte9 class would come from
   an IE conditional comment, or you can drop the filter via a conditional-comment style
   block instead (see the sketch further down). */
filter: none;
}
    You can see the demo page here.

    UPDATE Sept 25, 2011

Now that IE9 is out, some slight changes should be made to make this just a little bit better. IE9 added support for SVG backgrounds, so it behaves just like Opera did with this CSS. Meanwhile, Opera has added support for CSS gradients using the -o prefix since version 11.10, so a rule for that was added. The filter works fine in IE9, and a conditional-comment style rule could be added to prevent IE9 from downloading the SVG file and just rely on the filter property, but I feel this is not ideal. First of all, filter is old and has been known in the past to cause performance penalties. The real deal breaker is that filter does not play well with the new border-radius CSS3 property, which IE9 has added support for: using gradients with the filter property combined with border-radius will cause the background to spill out of the rounded corners, which does not look very good. If SVG backgrounds are used it works fine. So I recommend using a base64-encoded data URI to save a request for the SVG file, and using conditional comments to add a filter: none rule for IE9. NOTE: It doesn't work to base64 encode an svgz file; only plain svg will work.
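
The conditional-comment markup itself isn't shown in the post; one simple way to do it is a small override stylesheet that only IE9 and later will read. The selector here is made up for illustration, and this is equivalent to putting a .gte9 class on the <html> tag as in the code above.

<!--[if gte IE 9]>
<style>
  /* IE9 understands SVG backgrounds, so switch the filter gradient off for it */
  .my-gradient { filter: none; }
</style>
<![endif]-->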

    Except for the fallback image and the svg image for Opera, CSS gradients can easily be created using the Ultimate CSS gradient generator. The generator will give you a good start but needs to be expanded to be completely bulletproof.

    The first line:

    background: #FFF url('grad.png') repeat-x left top;
This uses a common 1px wide, 50px high image with a gradient created in GIMP going from black to white. If you want to ensure the gradient looks exactly the same in every browser, you could first create the gradient with CSS and then take a screenshot of it in Firefox or Chrome.

    SVG for Opera

    background-image: none,url('grad.svgz');
This line is for Opera and other browsers that support SVG but not true CSS gradients. How to create the SVG file is covered later. The important bit that other bulletproof techniques leave out is the "none," part. This is the syntax for specifying multiple CSS backgrounds, which means that browsers that don't support multiple backgrounds will ignore this line and the SVG image. It's a pretty safe bet that if a browser doesn't support SVG images it won't support multiple backgrounds, so this works very well. IE, however, interprets this line a little differently, but to our advantage: it parses up to the comma and ignores the rest, effectively setting the background to none. This is a good thing, since it prevents IE from downloading the fallback background image; it will just use the filter.

    For Webkit

    background: -webkit-gradient(
    linear,
    left bottom,
    left top,
    color-stop(0, #FFF),
    color-stop(1, #000)
    );

    The syntax is as follows:

    -webkit-gradient(<type>, <point> , <point> [, <stop>]*)

<type> is either linear or radial. For now I'm just going to talk about linear gradients; in another article I'll describe how to make cool list-item bullets using radial gradients. The <point>s are the starting and ending points respectively. They can be specified as x y coordinates using px or % units (e.g. 0px 10px) or with the keywords top, left, right or bottom. One or more <stop>s specify the colors of the gradient, written as color-stop(x, <color>), where x is a decimal between 0 and 1: 0 is the color at the starting point, 1 is the color at the ending point, and values in between give the colors along the way. <color> is any valid CSS color; clever things can be achieved using rgba-style colors. WebKit has recently introduced a new syntax for gradients similar to the -moz style, outlined on the WebKit blog. It is recommended that you still include the old style, because browsers that support the new style are not yet widely deployed.

    For Firefox

    background: -moz-linear-gradient(
    bottom,
    #FFF 0%,
    #000 100%
    ); /* Firefox 3.6+ */

    Firefox’s syntax is:

    -moz-linear-gradient( [<point> || <angle>,]? <stop>, <stop> [, <stop>]* )

Point is the starting position of the gradient, specified in px, %, or the keywords left, right, top, bottom or center. If no angle is given the gradient runs from top to bottom by default; an angle such as -45deg makes a gradient running from one corner to the other. Multiple color stops are specified with any valid CSS color and a percentage value. See Mozilla's site for complete details.

    The dreaded Internet Exploder

filter: progid:DXImageTransform.Microsoft.gradient(endColorstr='#ffffff', startColorstr='#000000'); /* for IE */

IE uses a very basic proprietary feature that only supports linear gradients with two colors in the horizontal or vertical direction. However, it is supported in IE 5.5+, which is rather impressive. All that needs to be specified is the start color and end color. You can use six hex digits, or eight in a form similar to rgba where the first two hex digits are the alpha value. So you can actually create a background that is half transparent with:
    filter: progid:DXImageTransform.Microsoft.gradient(endColorstr='#88FF0000', startColorstr='#88FF0000');
This creates a red background that is about half transparent.

There is some concern that using IE filters can cause performance problems with rendering the page and may not actually be the best choice, so I have done a few tests using webpagetest.org. Here is a test of the demo page using a background image: the median load time is 0.555s. Another test uses IE filter gradients: the median load time is 0.537s. The demo page uses several very large gradients, and they load 18 ms faster than the image-based gradients; the difference would probably be even greater on a complex page with several different kinds of gradients.

So filter gradients don't pose much of a performance penalty, but they can cause fonts to be displayed without anti-aliasing, also known as ClearType. This apparently is a change the IE developers, in their wisdom, decided to make in IE7 and kept in IE8: elements with a CSS filter don't use ClearType, so the fonts look terrible. There is an easy workaround that works in IE8 but not IE7: simply create an inner div or other element inside the element with the filter gradient, then give this div "position: relative;". The fonts will then be rendered correctly, as sketched below. It is also worth noting that gradients will not show up in IE6 or 7 unless the element has layout (hasLayout); this can easily be fixed by giving the element a width or using the zoom: 1; property.
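
A minimal sketch of that workaround (the class name is made up for illustration):

<div class="gradient-box" style="zoom: 1;">  <!-- zoom: 1 gives the element hasLayout so IE6/7 show the filter -->
  <div style="position: relative;">          <!-- the relatively positioned inner element keeps ClearType in IE8 -->
    Your text here
  </div>
</div>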

    Finally to create the svg image, open a text editor and enter:

<svg xmlns="http://www.w3.org/2000/svg" version="1.0"> <defs> <linearGradient id="gradient" x1="0" y1="0" x2="0" y2="100%"> <stop offset="0%" style="stop-color: #000000;"/> <stop offset="100%" style="stop-color: #ffffff;"/> </linearGradient> </defs> <rect x="0" y="0" fill="url(#gradient)" width="100%" height="100%" /></svg>

Save it as grad.svg. The syntax is similar to CSS3: x1, y1, x2, y2 are the starting and ending points, and the stop tags are the stop colors. The main browsers that will use this are Opera and iOS before v4. It does not save an HTTP request (unless you convert it to a base64 data URL), but it does provide true flexible gradients. Optionally you can gzip the file and change the extension to .svgz, which saves a few bytes. Interestingly, the resulting file was 219 bytes, which is actually larger than the 1x50px PNG file at 178 bytes. For SVGs to work they must be sent with the "Content-Type: image/svg+xml" header or they will be interpreted as plain text, which is not what we want. Usually web servers will send them correctly, but if not you can fix Apache by adding this to the .htaccess file:

    AddType image/svg+xml .svg

So there you have it: flexible and fast CSS gradients which use a minimum of HTTP requests. Please leave a comment if you have any success using this method. Next I will look at the cool things you can do with radial gradients.
