
Thoughts & Articles

When is “good enough” not good enough?

Jim Infantino
a rack of sweaters

Photo by Charles Deluvio on Unsplash

We all know the process. We think we are moving in the right direction, gathering the right tools to finish the job, and when we see the results we think, “It’s okay. It’s good enough.”

Too many times, this is how some people approach web design.

Perhaps the platform you chose promised the fabled combination of fast, cheap, and good. Perhaps you saw a template that looked exactly right in the demo but less than great when you put it together. Perhaps you hired someone who promised that the results would be customized to your standards, but spent most of their time managing your expectations. This is the promise of cheap and quick website building.

The truth is, you can either accept these limitations or you can up your game.

Your project is worth going the extra mile. Your brand deserves more. Your website should represent your best ideals and your unique offerings. Your website should be the optimal expression of who you are and what you do.

Templates are always going to center around an average. Often they are over-engineered with bulky code to provide limited customization. They offer good design, but it may not be the right design for you.

The template you choose may seem good at the time, but when your site is completed, it probably isn’t good enough.

This is why we build the way we build. We start with a blank slate and design in harmony with your brand from the ground up. We work until we get it right.

We encourage you not to settle for good enough. Instead, let’s build something great.

SEO basics

Jim Infantino

If you're new to search engines or just want to make sure your website is competitive in search, you need to understand a few basic ideas about content and meta information for your pages.

First: What is a page?

For the purposes of search engine optimization, a page is what shows up for each url on your website. That may sound self-explanatory, but each page on your website might be made up of multiple entries. Your homepage may contain a few different entries or articles, but to a search engine the homepage is always just https://yourdomain.com/ with or without a slash at the end. Each url is indexed separately by the search engines, so it's important to be mindful of your meta information (see below) for each unique url on your website.

Second: What is content?

a. Volume

What you write on each page is critical to search. There are no shortcuts to quality, unique, and authoritative content. Content is any text and/or images on each page (see above). Search engines like Google like to see unique and complete content on each page. This can be a problem for pages that are just a few lines of text, like your contact page, where long content may not be very helpful to a human visitor. Search engines like to see at least 300 words on a page to have enough data to determine what the page is about. That doesn't mean you should stuff every page with useless information. Remember, it's not worth optimizing for search engines if you sacrifice intelligibility for your visitors. If you do that, people might be able to find your website more easily, but once they are there, they might browse away too quickly.

b. Relevance

The copy on each page should be informative and should agree with your meta information (see below). Search engines have complex algorithms to determine whether your page is an authoritative source. The more the search engines see you as an authoritative source for the information you provide, the higher your score will be and the higher you will place in results for relevant searches.

c. Accessibility

This has become a big issue of late. It's important that your website is easy to read for people who have trouble seeing. In the website design phase, your Slab website should have standardized font sizing and color contrast that work well for people with low vision. Our websites are increasingly accessible to people who rely on tab-based browsing as well. However, the one thing we encourage you to do is to add ALT-TEXT to ALL of your images. There is a short text field next to the image upload when you create a new entry on your website. ALT-TEXT should be a simple description of the image you added. If it's a picture of a cup of coffee, it might read "cup of black coffee on a table." Imagine if you couldn't see the page and needed someone to tell you about the images on it. Simpler is better. Do not try to stuff search terms into your image alt-text unless they are directly relevant to what is in the image.
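To give a concrete picture, here is a minimal sketch of how that alt text typically ends up in the page's HTML once the site is rendered (the filename is just a made-up example):

<img src="coffee.jpg" alt="cup of black coffee on a table">

The text in the alt attribute is what a screen reader announces in place of the image.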

Third: What is meta-information?

If you have our new SEO tool installed, you will see an orange tab on each page of your website on the left-hand side, above the SLAB MENU. If not, you can find a meta information button in the control bar or in the SLAB MENU. There are a few important fields in this tool to pay attention to.

a. Meta Title

This is the title of the page that you see in the browser tab. Sometimes it's obscured, but if you hover over the tab, you will see it there. More importantly, this is the text that shows up at the top of a search result for your website. A good meta title should be unique within your website and between 40 and 60 characters long. If your meta title is too long, search engines may truncate it in results or rank that page lower. Here is a decent example of a meta title:

SEO Basics | Help with Meta Tags | Guidelines | Slabmedia

Notice that the name of our company is at the end. In all likelihood, getting found for the name of your company is going to be easy, but that doesn't help you win the search wars for terms that people who don't already know about you are searching for. It's good to have your company name in your meta titles on some or most of your pages, but the other text is the priority. This meta title is 57 characters, which is good. You don't want to hit 60 every time; that might actually count against you in the algorithms that run search engines. Getting close is better. This is the text that will get someone to click on your link in the search results, so make sure your meta titles are relevant to the content on the page and contain phrases that people might actually search for. Determining the optimal search terms is the job of SEO professionals, but it's fine to guess.
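For reference, in the rendered page a meta title is simply the contents of the title tag in the document head; a minimal sketch using the example above:

<head>
  <title>SEO Basics | Help with Meta Tags | Guidelines | Slabmedia</title>
</head>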

b. Meta Description

This is a short description of what is readable on your page. It should be a pithy summary of what the page is about. You can pick some content that's already on the page if that content is compelling and descriptive. The guidelines say these descriptions should be between 150 and 160 characters, but you don't need to be exact about that number. You can go a little over, but it's best not to go below 120 or over 220. Using search terms in your description is recommended, but it's unclear whether it has a direct effect on your search position. A decent meta description for this page might be:

If you want to make sure your website is competitive in search, here are a few basic ideas about content structure and meta information to keep in mind.

This is (hopefully) an enticing bit of information that might increase the likelihood a searcher will click on your link. One last thing to keep in mind is that search engines do not want to see duplicate meta descriptions on your site. Each description, like each meta title, should be unique.
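In the page itself, the meta description lives in a meta tag in the document head; a minimal sketch using the example description above:

<meta name="description" content="If you want to make sure your website is competitive in search, here are a few basic ideas about content structure and meta information to keep in mind.">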

c. Meta Image

This is not used by search engines, but it is useful for social media. When you post a url on Twitter or another social platform, the platform pulls the image from this value. In our SEO tools, we allow you to choose an image from the content of your page, or related content from the same section, to set the meta image.
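Under the hood, social platforms typically read this value from an Open Graph tag in the page head; a sketch of what that might look like (the image URL here is hypothetical):

<meta property="og:image" content="https://yourdomain.com/images/page-preview.jpg">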

Are we done?

Those are the basics, according to us. Remember, the algorithms are constantly changing, and SEO is highly competitive. We believe you should be able to take control of your own organic search profile, which includes all of the above points. You may, however, decide that you need more help with your position in search. If so, we have a number of professional SEO packages that can help you when you are ready to invest in better results.

Contact us to find out more.

Ideas for our musician clients during this national COVID-19 emergency

Jim Infantino
image of a band performing in a rehearsal space

Photo credit: Harrison Haines from Pexels

Hi to all of our Slabmedia musician and performer clients,


We know that with the current COVID-19 emergency, touring may be harder than ever before, if not prohibited entirely. We wanted to share some solutions other musicians are using so you have some ideas about options.

Recently, the site I used for live streaming Jim's Big Ego concerts, concertwindow, closed down, but there are some other options for online concerts you should check out:


Your fans want to support you, even in hard times.
patreon.com is a good way to get subscribers who look for your new material. You can post individual songs, home shows, etc.

Two of our artists have made good use of this.
Ellis Paul
Catie Curtis

I have my own, less successful Patreon account patreon.com/jiminfantino - what I would share from my experience is that it's important to plan your tiers well, and make sure you keep adding new material for your fans to stream, read, view, or download.

Those are the ideas I have right now. It might also make sense to think about live outdoor shows when the weather gets better. Outdoor gatherings are risky, but less so. We shall see how that shakes out when the weather gets warmer.

We are wishing you all the best in this emergency. Alexander and I work in two remote offices, so we will be available to help you with your online presence.

best wishes,

Jim Infantino

Bug at Let’s Encrypt caused secure certificate problems for some, but not for our clients.

Jim Infantino
Beetle close up

Photo by Alan Emery on Unsplash

We are always on the lookout for issues that might affect our clients’ websites, and yesterday we spotted an article describing a problem with Let’s Encrypt certificates. Let’s Encrypt certificates are in use on many of our clients’ websites, so we checked in with our server company to be sure nothing would impact our sites.

The problem seems to have affected some users who were applying a single certificate to multiple websites. A brief summary of the article is that Let’s Encrypt was disabling some certificates that were applied to multiple domains, for security reasons. None of our clients’ websites were affected, thanks to the fact that we use a superior hosting company, Pair.com, for all of our services.

You can read the full article below:

Upcoming changes to Google Chrome's User Agent String handling

Alexander
mashup of camera and chrome logo

Google recently announced plans to change a longstanding component of its Chrome web browser: the User Agent String, a fundamental feature of the browser used to announce to visited websites various information about the end user's browser and device configuration. Present in every major web browser, if not in every single web browser available, as well as in non-browser software that connects to the internet, the User Agent String has been a persistent characteristic of internet-enabled devices for most of internet history. So, what is a User Agent String, and why does Google want to change it?

What Is a User Agent String (UAS)?

At its most fundamental, the UAS is a piece of text sent from the browser to an internet server in conjunction with a request for content. The UAS announces various information about the site visitor so that the receiving computer can most effectively serve the request. Here is an example of a UAS:

Mozilla/5.0 (iPad; U; CPU OS 3_2_1 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Mobile/7B405

As you can see, various information about the type of device connecting to the server is sent in the UAS, including, according to Wikipedia, "Mozilla/[version] ([system and browser information]) [platform] ([platform details]) [extensions]."

For servers automatically reviewing the UAS, the information in these announcements has become very valuable. For example, when a server is able to recognize that a visitor is browsing from a Windows computer, it can suppress messages related to Mac computers and, say, show only software downloads that are compatible with the visitor's computer. Likewise, if a UAS indicates an older or non-compatible browser version, the server can display an error page or other upgrade instructions for the end user.

Why does Google want to change the UAS? In a word, privacy.

Google's plan is to freeze the UAS around September of 2020, such that all Chrome browsers show the same UAS regardless of the device on which they're running (with the exception that desktop and mobile browsers will still be differentiated). All other information will be standardized in the UAS so that further identification of browser version, operating system, and other details will be uniform. Why would they want to do this?

The move to essentially deprecate the UAS comes as part of Google's "Privacy Sandbox" initiative. As you can already see, the existing UAS automatically gives the receiving server *lots* of information about the end user's browser and device configuration, and this information is being exploited to fingerprint individual end users and groups of users to track their web usage for advertising purposes. Google intends to replace the UAS with a new feature set called User Agent Client Hints, which yields much the same information as the traditional UAS but will allow users to customize the amount of data they choose to share. In essence, the new standard will allow end users to block components of the UAS which are otherwise unblockable under the current scheme.
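To illustrate the direction of the new standard (the values below are hypothetical examples, not taken from Google's announcement), a browser using User Agent Client Hints might send only a few low-entropy headers by default, revealing more detail only if the server explicitly asks for it:

Sec-CH-UA: "Chromium";v="84", "Google Chrome";v="84"
Sec-CH-UA-Mobile: ?0
Sec-CH-UA-Platform: "macOS"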

The proposed change would appear to be a win for end user privacy. Given that Google is an advertising company, there's an open question as to why it would want to limit user fingerprinting, which presumably would be useful for it to generate more targeted advertising. However, migrating the information provided by the UAS to a user-configurable set of options is a move in the right direction for privacy, regardless.

Graphic Design and CSS: Where WYSIWYG went wrong.

Jim Infantino
Graphic Design and CSS Where WYSIWYG went wrong

Way back in the 1980s, some very nerdy graphic designers began to write their designs in code. John Warnock, one of the founders of Adobe Systems, had developed a vector-based page description language called PostScript that became the underlying code of many of the design and typographic tools we take for granted today. This very interface still relies heavily on it. Almost every letter in every word you read uses it. Before there was any such thing as Illustrator or PageMaker or InDesign or web pages, there were designers writing code to make a mark on a page. To draw an arc on a page, a PostScript designer would write in a text document: “300 600 20 0 270 arc stroke” and then adjust the numbers to get the right result in the compiled output. Crazy, right?

I remember reading about this and thinking,

“That is way too much effort. I could never think about my design in code. Give me a pen or a brush and some paper. That form of design is going nowhere.”

Then, along came the graphic user interface. Now everyone who designs writes about arcs and strokes and fills and vector positions on a virtual canvas, only we use a mouse and a computer screen and let the robots do the calculations for us.

So, what is wrong with that? Courses in Design are taught specifically in branded programs like Adobe Illustrator or Photoshop with little to no reference to the underlying code that enabled the process. These programs have enabled us to create increasingly complex designs more quickly than we could possibly achieve by hand-coding positions and shapes, and that is, by and large, a good thing.

Along came the web page, and some designers and non-designers began to code again. This time, the language was HTML. HTML was created by a group of programmers and thinkers led by Tim Berners-Lee at CERN. Unlike PostScript, it was not a proprietary system, but open to all to use, free of charge or license. It was meant to deliver linked information through a new idea called a browser. The first browsers ran in a terminal, not a graphical user interface (GUI) as we know now, but a stream of text. Design was not a factor; the sharing of linked information was the primary goal. When GUI browsers arrived in the 1990s, designers looked at the gray pages of Times New Roman on their screens and imagined the possibilities of something more exciting and engaging. Programming Designers, like those who had worked in PostScript, developed CSS and Javascript, and Web Design was born.

CSS stands for Cascading Style Sheets, and was developed by Håkon Wium Lie in 1994. Lie was also working at CERN with Berners-Lee. His idea was to take the underlying elements of the HTML page, like body, h1 (for a headline), p (for a paragraph), img (for an image reference), and div (for block elements), and give them design specifications like background-color, width, height, padding, margins, font-family, and more. Early web designers began to code or “script” these modifications to their dull-looking, gray pages, and things on the web got interesting.
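As a hedged illustration of that idea (the selectors are real HTML elements, but the specific values are invented for this example), a few CSS rules attached to those bare elements might look like this:

<style>
  body { background-color: #f5f0e6; font-family: Georgia, serif; }
  h1 { color: #333333; font-size: 2.5em; }
  p { max-width: 40em; padding: 0 1em; }
  img { width: 100%; height: auto; }
  div { margin: 2em auto; }
</style>

Each rule takes a plain HTML element and gives it design specifications, which is exactly the kind of modification early programming designers were scripting by hand.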

One year later, Brendan Eich at Netscape released the first version of Javascript, based on the same ideas as Lie’s CSS and things got even more interesting. Programming Designers like Jeffrey Zeldman began using it to animate the elements on the page to create dynamic content that changed on the screen due to variables, some of which could be set by the page visitor. At this point, the web page became a living thing, radically set apart from the printed page.

A very limited number of designers adopted these methods. The creative interface was still a plain text editor, usually a terminal program, to write the HTML code, and script the CSS and Javascript to make it look and feel pleasing to the viewer. Companies like Apple and Adobe saw the opportunity and created WYSIWYG programs so that graphic designers could automatically generate the web code without touching it. That was a good thing too, wasn’t it? Yes, of course, it was and no, it wasn’t.

The trouble with web design today is that each choice or fork in the process has exponential consequences. Modern web design is often reliant on templates and programs that make it easy to get started and generate a design that is balanced and pleasing. Content editors have drag-and-drop interfaces that allow for placement of elements with a gesture of the hand. The trouble arrives when a designer begins to think outside the parameters of the pre-defined GUI and discovers that they are blocked from making creative decisions by the template or program underlying the layout on the page.

When faced with the limitations of the interface, a non-coding designer has a choice:

  1. Start again with another template or platform that seems to better align with their vision
  2. Give up and let the client know that what they want simply cannot be done with the tools at hand
  3. Team up with a programmer that can shoehorn the elements into place with modifications to the constraining template or platform, adding cost to the project and complexity to changes down the road

None of these are good options.

What is needed to do this right is a designer who thinks in code. What kind of designer thinks in code?

Prior to the term being co-opted by the startup community, a Unicorn referred to a designer who was also a programmer. The belief was, and is today, that the ability to design precludes the ability to write code, and that those who can do both are as impossible to find as a mythical beast with a horn growing out of its forehead. People who have these talents are now sometimes referred to as hybrid designers for disambiguation.

For too long, the design curriculum has ignored code as an essential interface for creating graphic layouts. The focus has been on the WYSIWYG interface, rather than what it generates. All GUI interfaces for design have limitations in application. A page created in Photoshop is, by its nature, non-responsive to screen size. It lacks any accessibility elements. It ignores user input. A website built with templates will always be limited to the parameters of that template, no matter how flexible it claims to be. A designer who wants to rise to the challenges of how layouts are presented on the web must have some understanding of the underlying elements of HTML and how to modify their appearance and behavior to create durable, compelling work.

I can sometimes find examples of educational courses that teach design with code. I hope this is a trend. It is not, as has been asserted, the job of two very different brains to write code and create beauty. One brain will do, so long as we begin thinking of design from the perspective of rules and behavior, rather than fixed elements on a page. The interface is radically different from those based on the ancient modes of paint, brush, line, and type. The interface is the code itself.

Designers still need to understand color, form, and typography, but the tool to create their work is the script they write, not the compiled output of a program created to mimic the methods of the past.

Great web designs are built from the ground up without compromise. Web designers paint on an ever-changing canvas governed by an evolving set of rules. Without a deep understanding of these rules and materials, work created with easy to use but constraining tools will always come up short.

Are you a coding designer? Drop me a line.

– Jim