Launching Documents of Freedom, A Digital Take on the Traditional Textbook


The Bill of Rights Institute is dedicated to teaching young people across the country about America's founding history. Besides offering access to the texts of founding documents like the Constitution and Bill of Rights, the Institute also works creatively and vigorously to provide teachers with guidance and innovative materials. And, as today's student body becomes more and more tech savvy, that means turning to digital for new ways to engage students.

Explore the Course

The Institute called upon Viget to build a custom authoring platform that would give them the freedom (no pun intended) to create, edit, and publish a digital high school textbook, available across multiple devices, complete with teacher lesson plans, student readings, activities, quizzes, and videos. The project came with a variety of exciting technical challenges, and, in meeting them, we’ve been thrilled to help the Institute reach more students and teachers.

The Textbook, Redefined

Our platform, built in Ruby on Rails and hosted on Heroku, allows teachers to create free accounts, select their individual teaching standards (state, national, AP), and view teacher-only and student content mapped to their specific standards.

Standards

Don’t have time to create a lesson plan? No worries, each reading has a corresponding lesson plan. Not sure what type of activities to conduct in class? We’ve got that for you too.

Activities

Through extensive user research, we determined that a responsive design would be critical to the success of Docs of Freedom. Teachers and students view and digest content in all sorts of contexts: desktop, tablet, mobile, slow connections, and no connection at all. By developing a platform that is responsive across all devices, and by allowing users to download PDFs of materials and readings to use offline, we are ensuring an easy-to-use and engaging experience in every environment.

We Did Our Homework

Because this is the first authoring platform Viget has built, a lot of upfront user research and team collaboration happened in the early phases. We really needed to understand how teachers worked, what their pain points were, and how they searched for and digested classroom content. Because of this upfront investment, we were able to gain the trust of the Institute’s team and set the tone for a great working relationship. Our clients were enthusiastic, engaged, and up for pushing the boundaries of what was possible.

Survey Stats

We are really proud of the powerful platform we built for the Institute. For more details, check out our full Case Story. We believe it is an innovative take on the outdated textbook, meets the needs of teachers nationwide, and will continue to adapt to and engage with students over time.

Head of the Class

Within six days of launch, Docs of Freedom had over 585 teachers sign up from over forty-five states and the District of Columbia—with 776 sign-ups to date.

View DocsofFreedom.org and try not to learn something… we dare you!


Comic Con and Creativity


Earlier this month, I was fortunate enough to take some time off to attend New York Comic Con. While I considered my trip a mini-vacation, it taught me a few important lessons in creativity. Initially, Comic Con helped me reconnect with my younger self through the comic books I grew up with. But as the convention went on, it exposed me to others' experiences as I spoke to artists and learned about their creative processes and passions. Together, those two things reminded me that there are creative disciplines outside our own that we can learn from.

Growing up, I spent a lot of time reading comic books. While comic books remain at the core of the convention, it has now assimilated video games, TV shows, and movies, transforming into a conglomeration of creative professionals sharing their experiences and promoting their own stories. So, at its core, the conference is all about storytelling through visual language. Looking back, I realize that comic books and video games were a large influence on my creative profession. I drew the characters I related to, looked up to, and wanted to be like. While my Comic Con trip was recreational, the way it reconnected me to my earliest source of inspiration made it as inspiring as (if not more inspiring than) some of the expressly professional industry conferences I've attended in the past.

Comic Con also allowed me to reflect on others' experiences as I spoke to artists in the Artist Alley. In doing so, I had a chance to learn about their creative processes and passions (and to spend a lot of money on artwork). David Mack was one of my favorites. A writer and artist, he has a reputation for painted, collage-like work on Kabuki and Daredevil that precedes him.

Art from Daredevil / Kabuki by David Mack

Speaking to the artists and seeing the work first-hand reiterated the importance of the process of creating the stories that became building blocks of my youth. The connections I made with these stories were what really inspired me to be creative and enjoy storytelling.


Street Fighter II - Good Vs. Evil Cover by Alvin Lee

Zelda Propaganda Posters by Josh Clarke

Everyone has a unique story of their own, from which they draw to inform their intuition. We each have passions and interests that lead us to our creative fields, whether we geek out about video games or comic books. However, we often separate what we do recreationally from our work. Once we connect the two and apply what we love most to our creative outlets, we can learn and grow in unprecedented ways.

Getting Crafty


Here at Viget, we have been using ExpressionEngine as our primary off-the-shelf CMS for years. Craft and Statamic, both of which are developed by EE add-on developers, have really caught our attention. We finally determined that Craft would be an appropriate solution for a project, and while building the site, I built a couple small plugins.

Bust

I use Grunt to handle generating my CSS and JS files, and I needed a way to cache-bust those files so that a user would never see a cached version of an asset when a new version was available. So I made a quick plugin that appends a timestamp to a file's URL.

Usage

<link href="{{ craft.bust.er('/css/application.css') }}" rel="stylesheet">

Outputs:

<link href="/css/application.css?1372022079" rel="stylesheet">

You can download Bust on Github

Listing Section

The site I was building had a few areas that were Structure Sections: essentially, hierarchical pages. Some of those pages output entries from other Channel Sections on the site, but used the same basic layout as the other generic pages.

I needed to come up with a solution to output those entries on certain pages, so my immediate instinct was to use a segment from the URL and output the appropriate entries. This is how my generic page template started:

{% extends "_layouts/_base" %}

{% block title entry.title %}
{% set entries = craft.entries.find({ section: craft.request.lastSegment }) %}

{% block content %}
	<div class="main__content">
		<div class="text">
			<h1>{{ entry.title }}</h1>

			{{ entry.body }}
		</div>

		{% if entries | length %}
			<ul>
				{% for entry in entries %}
					<li>
						<a href="{{ entry.url }}">{{ entry.title }}</a>
					</li>
				{% endfor %}
			</ul>
		{% endif %}
	</div>
{% endblock %}

So technically that worked. But, it relied on the slug of the page being equal to the handle for the Section. Kinda brittle, kinda lame. Back to the drawing board.

I started to look into what it would take to build a fieldtype in Craft and was pleasantly surprised at how simple it was. So I had the idea to build a fieldtype that would be a select that listed all of the Sections in Craft. Then, a user could just choose the section to display on the page.

My favorite part about building a fieldtype? You get to use Twig for the views! The following is the actual select that gets displayed on the Section entry page:

<select name="{{ name }}" id="{{ name }}">
	<option value=""></option>
	{% for option in options %}
		<option value="{{ option.handle }}">{{ option.name }}</option>
	{% endfor %}
</select>

With the fieldtype built, I added a field named Listing Section to my Page Section.

Listing Section Fieldtype

Then in my template, I adjusted it to account for the new listingSection field, which outputs the selected Section handle:

{% extends "_layouts/_base" %}

{% block title entry.title %}

{% block content %}
	<div class="main__content">
		<div class="text">
			<h1>{{ entry.title }}</h1>

			{{ entry.body }}
		</div>

		{% if entry.listingSection %}
			{% include "_shared/_listing_" ~ entry.listingSection ignore missing %}
		{% endif %}
	</div>
{% endblock %}
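
Each listing partial then owns its own query and markup. Here's a minimal sketch of what one of those partials might look like (the "news" handle and the markup are hypothetical; the craft.entries.find() call mirrors the one from the earlier template):

{# _shared/_listing_news: hypothetical partial for a "news" Channel section #}
{% set items = craft.entries.find({ section: 'news' }) %}

{% if items | length %}
	<ul>
		{% for item in items %}
			<li>
				<a href="{{ item.url }}">{{ item.title }}</a>
			</li>
		{% endfor %}
	</ul>
{% endif %}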

You can download Listing Section on Github

Craft has been a pleasure to work with, and hopefully the future will provide us with more opportunities to enjoy it!

Streamline Your Workflow with the GA API

$
0
0

While Google Analytics (GA) is our go-to analytics tool across almost every project, sometimes pulling many pieces of data from within the interface grows tedious and time consuming. In these instances, GA’s Core Reporting API can save us a lot of time and frustration. This post will show you how we’ve modified a third-party script to make API pulls reliable and painless.

How it works

There are a number of ways to access the GA API, whether through Java, PHP, Python, or JavaScript. A while back some of our developers even whipped up a Ruby wrapper for the GA API. While each of these methods has its place, our most-used tool is a Google Spreadsheets plugin by Mikael Thuneberg called Data Fetch Functions. The script provides a means of accessing granular bits of data in a straightforward and flexible format.

To get started with the script, create a new Google Spreadsheet and open Tools > Script Gallery. Search for ‘Google Analytics Data Fetch Functions’ and install the version created by Mikael Thuneberg. Once that script is installed, open it up with Tools > Script Editor to get a better understanding of what it does and how it works. This German blog post provides a great summary of how to get up and running (Google Translate FTW!).

Once you’ve generated an authentication token and grabbed your GA profile ID, get ready to put the ‘fun’ back in function (I’m truly sorry). To start out, we can use the getGAData() function to find the number of site visits for a specific time period as follows:

=getGAdata({authToken}, {profileNumber}, "ga:visits", 9/1/2013, 9/30/2013)

Once you have that function working, feel free to get a bit more fancy by adding in advanced segments, filters and sorting to capture unique snapshots of your data. As a quick reference, the order of parameters is as follows:

  • token
  • profile number
  • metrics
  • start date
  • end date
  • filters
  • segments
  • sort (TRUE or FALSE)
  • include headers (TRUE or FALSE)
  • max rows
  • start from row

Google’s documentation provides details about the specific formatting requirements for each parameter.
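
As a quick illustration of the full parameter order, a query limited to organic-search visits might look something like this (the ga:medium==organic filter string uses the Core Reporting API's filter syntax; the token and profile placeholders work the same way as before):

=getGAdata({authToken}, {profileNumber}, "ga:visits", 9/1/2013, 9/30/2013, "ga:medium==organic")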

The Fun Stuff

The real value of the API is that it allows us to pull otherwise disparate bits of data into a single view, to quickly update date ranges for recurring reports, and to then perform additional calculations on the fly. The flexibility of the GA Data Fetch Functions within Google Spreadsheets presents opportunities to create highly customized dashboards or hand-selected groups of KPIs.

In addition, the API can save a ton of time when we want to collect many bits of data that would prove tedious within the GA interface. For example, imagine we have a list of page paths for our website and we want to see, for each page, the number of visits that included a view of that particular page. Pulling each of these numbers from within the GA interface would be tedious and time consuming, requiring a unique Advanced Segment for each calculation. With the GA Data Fetch Functions, we can simply create a function to dynamically calculate each of these requests and pull the data almost instantaneously.

This example spreadsheet gives an idea of how we might approach this problem. Once the formulas are configured with the correct absolute and relative references, it’s easy to create a list of hundreds of unique segments and to update the data on the fly!
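
For instance, if the page paths live in column A starting at row 7 and the token, profile ID, and dates sit in fixed cells, each row's formula might look something like this (the cell layout here is purely hypothetical):

=getGAdata($B$1, $B$2, "ga:visits", $B$3, $B$4, "ga:pagePath==" & A7)

Because the filter string is built from the neighboring cell, filling the formula down the column generates one query per page path.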

Except...

...there is one big ‘but’. To protect itself from receiving more data than it can handle, Google places limits on the maximum number of API requests from any given account.

Several quotas apply:

  • 50,000 requests per project per day
  • 10,000 requests per profile per day
  • 10 queries per second (QPS) per IP
  • 10 concurrent requests per profile

We’ve never run up against the first two quotas, but numbers three and four can prove challenging when we’re creating workbooks with lists of unique queries. Fortunately, a clever workaround allows us to temper the rate at which we submit queries so that we avoid a nasty error message.

Cascade your queries

When pulling a long list of queries, the easiest way to avoid the quota limit is to hold the execution of each query until the previous query has completed. We use the following formula:

=if(or(isnumber(Previous Query Result), Previous Query Result = "No data found"), getGAData(...), "")

Simply wrap each getGAData() function in the list (except the first one) in this if-statement to cascade execution.
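
Concretely, if the previous row's result lands in cell B7 and the next row pulls data for the page path in A8, the wrapped formula might look something like this (again, the cell layout is hypothetical):

=if(or(isnumber(B7), B7 = "No data found"), getGAdata($B$1, $B$2, "ga:visits", $B$3, $B$4, "ga:pagePath==" & A8), "")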

10 queries per second

Cascading queries takes care of the concurrent request limit, but sometimes the API actually pulls data so quickly that we plow through more than 10 queries within a single second. To avoid this problem, we can add a sleep timer to our Data Fetch Functions script. Simply add the following snippet immediately before your getGAdata() function is declared:

Utilities.sleep(1001);

This little guy makes each of our queries wait for 1.001 seconds and ensures we don’t bump up against the 10 QPS limit. While it slows our process a bit, it’s a necessary evil to avoid quota errors.

Caveats and Final Thoughts

Even with all its utility, this toolset carries with it a few important caveats:

  • Data sampling is not apparent. GA samples data for queries on large sets of data or for very specific subsets of data (more details here). The GA interface provides a notification that data is sampled, but no such notification exists from within the Data Fetch Functions. Be aware of this fact and be sure to duplicate a few of your queries from within the interface to understand whether or not your data is sampled.
  • Data formats aren’t pretty. Each time queries are re-executed, new values will ignore previously set formatting within Google Spreadsheets. In cases where we want a pretty final product, we pull our data into one sheet, usually called ‘Raw’, and then do our nice formatting in a totally separate sheet by referencing values across sheets.
  • Durability isn’t guaranteed. While the Data Fetch Functions have proved robust enough for most of our applications thus far, it’s important to remember that they weren’t officially developed by Google, so support for future changes and updates may be slow or limited.

Huge kudos to Mikael Thuneberg for developing the original Data Fetch Functions. We’ve found them hugely useful and continue to use them on a near-daily basis. In addition to the Data Fetch Functions available for Google Spreadsheets, he’s created an even more robust Supermetrics Data Grabber for Excel that provides a number of additional features.

Let us know how you fare with the Data Fetch Functions, particularly when pulling large data sets. Have you found other ways to quickly pull large numbers of unique queries? We’re always on the lookout for ways to improve the way we work!

Visualize Traffic Data on Site Maps Using Google Analytics and Omnigraffle


Two of our greatest loves at Viget are information architecture and analytics data -- but, it can be a challenge to easily visualize both at the same time.  A crucial part of re-architecting a site involves understanding which buried, but high-trafficked, pages could be better bubbled up and which prominent, but less-popular, pages could be de-prioritized.  Google Analytics does a great job showing your top pages, but what about your not-top pages?

On a recent project, KV, Eli, and I created a process for colorizing Omnigraffle site maps based on the number of views a page received or based on the date the page was last modified.  Download the template here and try it for yourself!  Note that this only works for Macs.

A final product looks like this:

In summary, you pick ranges for individual colors:

then hold down the “B” button in Omnigraffle and click either the grey “Pageview Bucket” or “Date Bucket” buttons.  All the individual pages will then fill with the appropriate color.

Here’s the complete process we use for making this as speedy as possible:

1)  As you’re building your site map, ensure each of your page objects includes three Data Keys: class, last, and pageviews.  You can add these by opening the Omnigraffle Inspector and navigating to Properties > Note.

2)  Copy and paste the colorized buckets and grey buttons from the template onto your canvas.

3)  Use your standard analytics tool to pull pageview data for each individual page.  I typically automate this by finding a client’s XML site map, pasting it into a Google Spreadsheet column, and using the Google Script that Mitch describes here to automatically pull data for each individual URL.

4)  Here comes the manual component: copy and paste each of those pageview values from the Google Spreadsheet (or your own analytics tool) into the associated pageviews Data Value for each page in the site map.

5)  Click on the grey “Pageview Bucket” button and open the Inspector.  Navigate to Properties > Options.  You’ll see a script that looks like this:

6)  Copy and paste this script into a text editor and edit the if-statements to fit the ranges that make sense for your project:

-- read the fill color of each legend swatch into its own variable
repeat with colorScale in (every solid of every group of canvas of front window where value of user data item "class" is "pageviewBucket")
if value of user data item "id" of colorScale is "scaleOne" then set colorOne to fill color of colorScale
if value of user data item "id" of colorScale is "scaleTwo" then set colorTwo to fill color of colorScale
if value of user data item "id" of colorScale is "scaleThree" then set colorThree to fill color of colorScale
if value of user data item "id" of colorScale is "scaleFour" then set colorFour to fill color of colorScale
if value of user data item "id" of colorScale is "scaleFive" then set colorFive to fill color of colorScale
if value of user data item "id" of colorScale is "scaleSix" then set colorSix to fill color of colorScale
if value of user data item "id" of colorScale is "scaleSeven" then set colorSeven to fill color of colorScale
end repeat

-- color each page object according to its pageview count
repeat with thePage in (every solid of canvas of front window where value of user data item "class" is "page")
set fill of thePage to solid fill
set fill color of thePage to {1, 1, 1}
set currentPageviews to value of user data item "pageviews" of thePage as number
if currentPageviews < 50 then set fill color of thePage to colorOne
if currentPageviews > 49 and currentPageviews < 500 then set fill color of thePage to colorTwo
if currentPageviews > 499 and currentPageviews < 5000 then set fill color of thePage to colorThree
if currentPageviews > 4999 and currentPageviews < 25000 then set fill color of thePage to colorFour
if currentPageviews > 24999 and currentPageviews < 50000 then set fill color of thePage to colorFive
if currentPageviews > 49999 and currentPageviews < 100000 then set fill color of thePage to colorSix
if currentPageviews > 99999 then set fill color of thePage to colorSeven
end repeat

-- grey out pages that have no pageview data
repeat with thePage in (every solid of canvas of front window where value of user data item "class" is "noPageviews")
set fill color of thePage to {0.683673, 0.683673, 0.683673}
end repeat

7)  Replace the current script with your updated script.  Hold down “B” and click the “Pageview Bucket” button to see it in action!

8)  Repeat this process with last-modified data if you have it.  The resulting visualization will likely show you sections of the site that haven’t been updated in years, versus the parts that have seen continual updates.  If you want to add more or fewer colors to either scale, you can build on the code -- just make sure the new buckets of color you add have a class of pageviewBucket and an associated id that follows the format of the other blocks of color. 

9)  Bask.

While building the visualization does take some legwork, we’ve found that it’s usually worth the investment.  As new questions and decisions come up during the course of the project, the project team can refer back to the document and easily continue drawing conclusions.  Hope you find it helpful!

Hotfix or Not? Managing a Successful Release Process


It's obvious that good communication is critical to the success of any project. On a recent large project, the many layers of communication needed to coordinate some daily operations have become even more clear to me. In particular, I've become attuned to something that I never realized would be relevant to a PM: release management.

We've been working on a big, complex app for almost a year that has had up to eight front- and back-end developers working on it at once. On top of that, we've been simultaneously working on large features, small features, quick support fixes, and QA. With all that in play, it meant that once this app was in the wild, we would need a rock-solid release management plan.

The Challenges

In landing on a release process, our team first talked through the scenarios that our release process would need to support:

  • Ongoing development of large features that would need to stay isolated until completion and approval by the client
  • Frequent releases for small and medium tasks that aren’t urgent, but still require a separate QA/approval process by the client
  • Critical fixes that need to go live ASAP

It was clear that flexibility needed to be at the core of our process. We didn't want to slow down ongoing development, but needed to be able to release frequently and react quickly to emergency situations.

The Solution

We ultimately landed on a modified version of this successful git branching model to handle our releases.

Main development happens in the "master" branch. The types of changes that go directly in master are low-risk changes that likely won't need any tweaking after QA and we're confident will get a thumbs up from the client. Additionally, only complete work should go in master—a work-in-progress or tasks that still need contributions from another team member shouldn't go in master. Commits in master are automatically deployed to our internal integration environment if they pass automated testing, so the PM can always QA the most up-to-date version of master.

When everything that's new in master has been QAed and we're ready for it to be part of the next release, it's time to split off a new release candidate branch from master. We deploy this release branch to a staging environment, where the client can check it out. Any bugs discovered during this period should be fixed directly in the release branch. Once we're clear to go live, we merge the release branch into production.
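
In git terms, cutting and shipping a release candidate looks roughly like this (the branch name follows the release-X.Y.Z convention described below; the staging deploy itself depends on your hosting setup):

# cut the release candidate from master once everything in it has been QAed
git checkout master
git checkout -b release-0.8.0

# ...deploy release-0.8.0 to staging; fix any client-reported bugs directly on this branch...

# once we're clear to go live, merge the release branch into production
git checkout production
git merge release-0.8.0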

If we're developing a new feature that will be worked on over time and won't be ready to launch for a while, or if it's something that will need to be touched by multiple people, that will stay in a separate feature branch. We'll deploy that feature branch to a staging environment for client QA, and it won't go in master—and subsequently into a release candidate—until it is complete and has client sign-off.

When we need to deploy something to production right away, and we can’t wait on the next release candidate to either be complete or approved, we’ll do a "hotfix." A developer will create a hotfix branch, which is created off of the production branch rather than master. Once the hotfix is complete, it gets merged into production as well as master, or the existing release candidate if there is one. Hotfixes allow us to react quickly in stop-the-presses situations.
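
A hotfix follows the same pattern in miniature, except the branch starts from production rather than master (the 0.8.1 name here is illustrative, borrowing the Z slot from the tagging convention below):

# branch the hotfix off of production, not master
git checkout production
git checkout -b release-0.8.1

# ...make the fix, QA it...

# merge it back into production, then into master (and any open release candidate)
git checkout production
git merge release-0.8.1
git checkout master
git merge release-0.8.1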

The Process

With a lot of hands touching this code, it's important to have clear communication and established protocol around this process. Here are a couple things our team has been doing to keep this smooth:

  • We've established that all tickets go in master by default. If a task shouldn't go in master, the PM should communicate on the ticket that it should remain in a separate feature branch.
  • If you work on a ticket that will need contribution from another team member, you should share the name of the branch you worked in when you resolve the ticket. Now the PM knows to open a new ticket referencing this branch for the remaining work.
  • We follow a tagging convention to name releases. Release branches are named "release-X.Y.Z." "X" is the major version number—we started with 0, and bumped it up to 1 when our app moved from a private beta to a public launch. "Y" increments by one for every new release. The "Z" slot is used when it’s a hotfix.
  • Before we deploy a release candidate to staging, a developer posts the commits that are included in that release candidate in Campfire. We use that as a final check that everything on that list has been QAed. It also serves as a good log to refer back to via Campfire's transcripts. Lastly, I use that list to generate the "here's what's in this release candidate" list I'll share with the client.
    • A protip from David: We use the git command git cherry -v 0.7.2 (where 0.7.2 is the last production deploy) to generate this list.

Why Does A PM Need To Care About This?

You may be thinking that this sounds pretty dev-specific. Why would a PM need to care? I'd argue that it's critical for a PM to be fully engaged in this process, especially if you're working on a living, breathing app in a client situation.

For starters, you have insight into the client's urgency that the development team may not have. For example, a copy edit may not seem like a big deal and a developer may think that it can wait to go out with the next release candidate. But you may know the real story: it's actually related to a typo on an ad, and your client has an advertiser who's livid about the mistake. That screams HOTFIX!

You're also likely to have the best judgment when it comes to telling whether the client will want or need to QA something for an extended period of time before approving it -- which would mean it should stay out of master. You'll know what sorts of changes the client will be sensitive to, even if it's something that may seem like a trivial change to others.

I encourage PMs to get up to speed with their development teams on how to handle releases. Take it upon yourself to understand the process. It can and should impact how you think about tasks and communicate them to your team and the client. You can help make swift and smart decisions that will make releases fluid for your team and efficient for your client.

And the best part? You get to feel like a total badass and say things like “Let’s deploy release 0.4.0 to production,” or even better, “HOTFIX IT!” Really, it doesn’t get much better than that.

A Safe Place to Fail


Image Credit: Blair Culbreth

This recent Huffington Post article about the culture differences associated with hiring Generation Y has gotten a lot of buzz within HR and recruiting circles lately.  However, as a company with a lot of Gen Y employees (between 56% and 90%, depending on how one defines Gen Y) that interviews an even greater number of Gen Y job applicants, we very seldom encounter the unrealistic expectations portrayed in the article.  The few instances that come to mind have occurred with the youngest folks we interact with—summer interns—and those episodes can just as easily be attributed to youth and inexperience as to the notorious unrealistic expectations HR lore attributes to Gen Y.

As a member of the Baby Boomer generation, I do sense one generational difference with those entering the workplace these days—not in terms of unrealistic expectations about needing to work hard to gain experience—but in a willingness to fail.  Raised by a no-nonsense generation whose general guidance can be summed up with the expression “no complaints, no excuses,” my colleagues and I entered the workplace in the mid-1980s with the full understanding that we would be put in situations where we: 1) didn’t know what we were doing; 2) failed miserably, both publicly and often; 3) owned our mistakes and embarrassments as badges of honor; and 4) knew that we had to rely on ourselves to improve our situations.

When I talk with the younger members of Gen Y these days, I often sense a real fear of failure. The message many in this generation seem to have internalized is “if I’m not sure I can do it perfectly, then I’d rather pass.”  They loathe the idea of falling on their faces.  If there’s one message I could give to those entering the workplace for the first time, it would be: it’s okay to fail.  We expect it.  You’re inexperienced—and the best way to gain experience is to try (and risk botching it).

At Viget, we’ve really tried to create a supportive environment in which it’s safe to try new things—and, yes, sometimes fail.  Some examples:

  • We host four public blogs on which staff are encouraged to post (and risk ridicule). Blog posts are not filtered through any internal approval process.  Rather, each post reflects the personality, experience, and perspective of the individual.
  • Our front-end developers host “code sacrifice” events where they invite their Viget colleagues, as well as outsiders, to discuss/review/rip apart code they’ve recently written.  The willingness to be transparent and to openly encourage discussion that facilitates learning and improvement is just one aspect of Viget culture.
  • Once a month, members of our design team share their recent work and invite critique and discussion, not just on their work -- but, on how they present their work so they can improve their client presentation skills and hone their abilities to respond to feedback.
  • Our Rails developers host “hack nights” and participate in Rails Rumbles, 48-hour programming competitions, to keep their coding chops fresh.
  • We encourage staff to explore new technologies, like Eli has done with Arduino, and new services, as David discussed with OpsWorks.  As well, through our Pointless projects, everyone has an opportunity to try on a different hat and join a team that is working to build some awesome side project.  Are you a designer dying to challenge yourself by learning to do JavaScript animations?  A Rails developer with a secret passion for user experience design?  Pointless projects provide an outlet to explore new skills or interests.

These examples help underscore the culture we have here at Viget -- but, they also represent low-risk opportunities in which you can hone your skills and then apply what you learn on client engagements.

We often hear Brian speak at our quarterly all-hands offsites about “progress, not perfection.”  I think that perspective says a lot about how we operate at Viget and about the expectations we have of the folks who work here.  Collectively (and individually), we are always in learning mode here, trying to achieve incremental improvements across the board.   We create surveys to gather internal feedback and share lessons learned about everything -- from client projects to candidate evaluations to our weekly Free Lunch Friday function.  We don’t expect perfection, nor are we striving for such an unattainable goal.  We just always think we can do a little better.

Launching Vitae with The Chronicle of Higher Education: An Online Career Hub for Academics


Professional growth and sharing is a big part of our culture at Viget — with each other and the larger web community. And whether it’s blogging, joining a panel at a local event, or planning a meetup with industry peers, we know the value of building a network within your professional community.

So when The Chronicle of Higher Education approached us about building Vitae, a product to help members of the higher education community grow their careers, we were excited to dive right in.

A Unique Challenge for a Unique Audience

Founded in the 1960s, The Chronicle has been the top name for news, advice, and jobs in academia for decades. When they took their publication online in 1993, they were one of the first newspapers to make the leap to the Internet. So it’s no surprise that The Chronicle was first in line to innovate again — this time to solve the unique and long-standing career challenges academic professionals face.

In recent years, members of other industries, especially our own web/tech industry, have come to count on various social networking sites to help find jobs, connect with other professionals, showcase achievements, and build careers. But what we consider “the usual channels”—LinkedIn, Facebook, Twitter, and others—just don’t have the flexibility and features to show academics at their best in an online, professional environment.

So we set out to create an online career hub to meet the unique needs of this community. For the past year, we've embodied all-things-academia to understand higher education career paths and the complex academic hiring cycle. We designed and developed a robust platform, built in Ruby on Rails, that focuses on functionality key to this audience.

A Flexible System

An academic career can span the course of a lifetime and, until now, there hasn't been a digital platform to showcase a full body of work. Vitae's profile system is flexible enough to fit the diverse members of the higher education community. Whether a first-year graduate student, a college dean, or a university professor, users can tailor their profiles to meet their needs. A research-focused faculty member may prefer profile sections like 'Publications' and 'Research', while a teaching-focused professor may choose 'Courses' and 'Teaching' to feature on their profile.

We knew this community was busy, and wanted to avoid "another profile" that was time consuming to populate and maintain. If users don't want to fill out a profile by hand, they can import data from LinkedIn or their CV. And entering large volumes of new data—like long lists of publications—is seamless.  

Building Relationships... and Careers

Vitae is not just another quasi-social, quasi-professional online network. Vitae uniquely reflects the networking culture that’s already in place throughout academia. It prompts users to confirm that they know each other and identify how, which makes for more discriminating choices than are typical of other online networks.

In addition to connecting with colleagues, members can access and share the latest news about academic careers written by The Chronicle's editorial team and thought-leaders in higher ed, known as Vitae Voices.

When members enter the (ever more competitive) academic job market, they can search thousands of jobs by multiple metrics, save them for later, and get notified when there are new jobs that match their interests. There's also an easy way to manage the hundreds of documents that make up an academic dossier, keeping them handy for job applications. And when they find the job they're looking for, they can manage the entire application and submission process — all within Vitae.

This is just a glimpse into the opportunities that came from working with The Chronicle on an innovative product. We’re excited to see Vitae help academics get ahead in their careers, and become a staple part of their culture.

For more details, check out our full Case Story here. Today, we’re thrilled to share the results of all our work and present chroniclevitae.com.


Purpose, Then Product: A Place For Emotional Design in Ecommerce


Purpose and product - ecommerce design

Look at any really successful product brand and you’ll see that they build their brand around a purpose, not a product. Our decision making is largely guided by our primitive brain, and that’s what we tap into when we design for emotional response. We know people don’t just buy a product because it functions the way they need it to. They buy it because it makes them feel a certain way. We often buy one brand instead of another simply because our gut tells us one feels “right”. Although we’d all love to believe we’re rational shoppers, in truth our purchases rarely have to do with specs, ingredients, or even efficacy.

Like everything else it does, our primitive brain starts deciding on a brand’s purpose immediately and unconsciously. In just a few split seconds those first impressions get cemented somewhere deep inside our minds. So sending out the right visual cues to instantly communicate a brand’s true purpose to the primitive brain is crucial.

 

Purpose is part of why working with Lansinoh was so much fun for me. They’re a purpose-driven company, and all of their feedback and input reinforced their commitment to being more than just a company that sells products. They get it right by sticking to what Simon Sinek calls the “golden circle” - they focus on the why (to support breastfeeding moms) and let the what and how follow with product lines that do just that. So when it came to their website, they wanted to make sure that products were balanced with content and messaging that supported new moms. They understood the importance of emotional design in an ecommerce site.

I know Lansinoh’s target audience very well: as a mom myself, it wasn’t so long ago that I was seeking out their iconic purple boxes on the store shelves. Most moms recognize their packaging right away... and most dads have probably frantically scanned the aisle for it at some ungodly hour. Lansinoh is a brand with over 30 years of history to leverage, so I knew I would have a lot to work with. But I also knew that the existing online brand, which leaned very heavily on their signature purple, would have to stretch to fit two newly acquired product lines (mOmma and Earth Friendly Baby) that had their own distinct look and feel.

We started by evaluating the five sites we’d be merging. FIVE. Did I mention it was five? Three brand sites, an ecommerce site, and a site for breastfeeding professionals all rolled into one. That’s a lot of combining!

The look and feel across those five sites varied pretty greatly. On the old Lansinoh.com, heavy use of purple felt dated and boxed in their online brand. We completely understood their attachment to purple, but also recognized that online we weren’t competing for a user’s attention in quite the same way. Users weren’t scanning the shelves for that familiar purple box; we had their eyeballs already, so we could lighten things up to let the content breathe and better accommodate the new product lines.

Conceptually, the old designs weren’t as strong as their content. Despite housing some great resource content, they felt product-heavy and used very few lifestyle photographs. They lacked emotional impact. We knew this was something we could fix as well.

Finding An Honest Voice

I was full of ideas for the redesign, but all of them basically boiled down to this: the site should convey the sense of intimacy that comes with new motherhood, and it should be honest. If exhausted moms were searching the site at 4am with a newborn in the crook of one arm, I didn’t want their first impression to be perfectly coiffed model moms in stain-free white tank tops holding angelic babies in matching white onesies. That commonly seen, highly sterilized vision of motherhood feels way off to me, and a quick poll of other moms told me I wasn’t alone. Maybe we didn’t have to show the whole truth about new motherhood, because there’s nothing warm and fuzzy about day-old baby puke on your shoulder... but some of the truth? Some candidness, some reality, and a few faces who were naturally beautiful but not model-perfect? That much we could do.

Lansinoh.com Sample Mood Board

We started tackling this idea in the mood board phase, getting a sense for how comfortable everyone was with the “real deal” approach. We explored different levels of honesty, including things like photos with drool on a mom’s shoulder and a test-the-waters breast icon. Not everything made it past the exploration phase, but playing with both tone and communication style early in the game helped us all sort out the right voice for the site. Happily, everyone agreed that “honest best friend” was the right route.

Solving Bad Stock Photo Syndrome

Once we nailed down the overall tone and concept for the site, we were off and running. To quickly communicate Lansinoh’s brand and purpose I knew I wanted big, beautiful, engaging photos that would feel welcoming to new moms. Close-ups of babies and kids photographed the way you might see them if you were the one caring for them. The ideal photos would include mom or dad nurturing, but they would not be the focus. We’re wired to feel an immediate emotional connection to a cute baby’s face; when I added in an adult face I found it broke the spell a bit.

Lansinoh didn’t have lifestyle imagery to go with their products, so early on Jackson and I played around with various design solutions that tried to balance product shots with stock lifestyle imagery. And even though I searched long and hard to find stock photos that didn’t feel, well, stock, the pieces felt like two different worlds I just couldn’t connect. I pushed things around and around...

Lansinoh, Work in Progress #1

Lansinoh, Work in Progress #2

Lansinoh, Work in Progress #3

 

It never felt right. Eventually, I talked to my team about the gap, and we started throwing around the idea of a photo shoot. There wasn’t room for it in the existing budget, but we kept it as a “nice to have” for future client conversations. We moved forward with a layout that combined an image with a product, but wasn't quite as heavy-handed with the product line. We were getting somewhere, but the need for lifestyle images was clear. How can we show a baby being fed without the beautifully designed mOmma spoon in the photo?!

Lansinoh work in progress #4

Lansinoh saw the need for lifestyle product images and identified with the struggle to find “right fit” photos. Given the specificity of the subject matter, getting the right mix of babies with the right emotional tone was difficult, if not impossible, without a significant investment in higher-end Rights Managed photos. And with stock photos you always run the risk of seeing the same image turn up on your competitor’s website! So we talked through the options, including the idea of a custom photo shoot. It was an added expense, but one that was comparable to choosing generic Rights Managed images. And custom photos would do much more to strengthen the branding. We could show the products being used and bring in hints of the Lansinoh color palette to reinforce the brand.

After some estimating and a few chats with our sales team, we got great news - Lansinoh had signed on for a custom photo shoot with our very own Zach Robbins. They knew enough moms and babies to fill our needs for models, and were onboard with our ideas for more honest and intimate images. Which meant... time to start prepping.

Knowing we only had one day to get the photos and a bunch of tiny, wiggling, impatient models with short attention spans to capture, we storyboarded each shot so everyone was on the same page. Zach, Heather and I worked together to come up with a detailed plan for the day. We blocked out an hour for each major shot, which gave us plenty of squish time in-between to capture extra images and candid moments that might come up. I brought my laptop and tested out shots as we collected them to make sure they would work in our layouts, which proved to be extremely helpful. The day was a huge success - the babies (and moms!) were very cooperative, Zach nailed the shots, and we left with exactly what we were hoping for. We were subbing our photos into comps the very next day.

Photo Shoot Results

 

The Result

In the end, the site we launched was both highly usable and emotionally engaging. It's modern, warm, and feels every bit a part of the Lansinoh family. The feature area photos tied everything together and brought the color palette to life. We pushed boundaries where it made sense, but always stayed true to the Lansinoh brand. And, oh yeah, we totally got to hold all the cute babies in these photos. How lucky are we?

Lansinoh Breastfeeding Category Landing Page

Lansinoh responsive

Using Google Analytics API - I got the Magic Script


Simply put, Google’s Magic Script makes accessing Google Analytics from within Google Spreadsheets as easy as falling off a log. If you look at web analytics often, do yourself a favor by familiarizing yourself with this Google Analytics solution, and be sure to try the templates shared later in the article. In this post, we’ll go over the benefits of Magic Script, how it works, and how you can use it to make your life easier.

Providing Insights Should Be Your Priority
Why is this such a big deal? Your priority should be analyzing and gaining insights from your data, not capturing and reporting it. The more efficiently you gather data, the better prepared you’ll be to glean insights and drive actual change. At the same time, you’ll look smart and save time - a win-win! Magic Script costs nothing and is well-suited for plenty of uses, including:

  • Data or charts that need to be shared publicly or internally. Google Spreadsheets gives you access to interactive charts and tables, which you can share with others. These can be posted anywhere online, even on an internal site or wiki page. This gives you control over what data is released and lets you update external charts from within the spreadsheet.
  • Periodic reports, such as those run on a monthly or quarterly basis. The Google Analytics browser interface is simple to use, but you may still find yourself pulling the same reports and pasting the same data points into spreadsheets or emails, which can eat up hours every cycle. With Magic Script, you can update all of your reports with one click (or no clicks if you use triggers!) and move on to more important tasks.
  • Quick snapshots - a set of metrics that you review every day. If you find yourself looking at the same reports every day but want to save time, you can use Google Spreadsheets to lighten your load. Setting up a daily dashboard to monitor activity saves you from running reports and shows data from multiple reports in one view.
  • Automatically updated reports. Need to have reports update without you there? Google Spreadsheets allows you to set “triggers” or instructions on when to run. For example, you can set reports to run every Monday morning, so you can start your week analyzing data, not organizing it.
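
If you'd rather create that trigger in code than through the Script Editor's menus, a time-driven trigger is just a few lines of Apps Script. Here's a sketch that assumes the report-running function in your copy of the script is named getData (check the Script Editor for the actual function name in your sheet):

// run the report-building function every Monday at 6am
// 'getData' is an assumed name; swap in the real function from your script
ScriptApp.newTrigger('getData')
  .timeBased()
  .onWeekDay(ScriptApp.WeekDay.MONDAY)
  .atHour(6)
  .create();

Paste that into the Script Editor and run it once; from then on, the report refreshes on its own each week.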


How Does The Magic Work?
Now that you’re convinced this is the solution for you, let’s look at what Magic Script does in more detail. In the GA browser interface, you often pull data from disparate reports with various filters and segments applied. Magic Script lets you pull the exact information you need into a single view in a way that's more flexible than GA's built-in dashboards or custom reports. The API normally requires some programming knowledge, but this solution allows lay users to easily take advantage of its capabilities.

If you feel like you’ve read about the Google Analytics Core Reporting API on our blog before, you’re not going crazy. Mitch recently wrote a post on another Google Spreadsheets plugin - the Thuneberg GA Data Fetch script. However, Magic Script differs from the Thuneberg script and other API access methods. There are a few reasons why you might want to use Magic Script instead of other Core Reporting API methods:

  • This Google Spreadsheets plugin was released by Google and built by Googler Nick Mihailovski.
  • Security is based on the permissions of your specific Google account, so there’s no need to enter any login information anywhere on the spreadsheet.
  • Data sampling indication is available in reports - so you’ll know whenever you’re working with sampled data.
  • It’s free!

One potential drawback, however, is the work needed to make additional reports. There are some cases where a more flexible solution may be more appropriate.

Try These Dashboards
In order to make Magic Script even easier, we’ve created a couple of dashboards to give you a head start. If you’re familiar with Magic Script (and dimensions & metrics and GA API formatting), you’ll notice that creating reports should be more intuitive with the help of spreadsheet functions. For those who want to create customized reports from this template, I would suggest reading the “Notes” sheet.

Periodic Reporting Dashboard

One-Time Reporting Dashboard


Setting Up Dashboards
While you can certainly build your own reports via Google’s instructions, here are the basic steps to set up one of our dashboards:

  1. Make a copy of the dashboard version you’d like, rename, and open the copy.
  2. Turn on access to the APIs for your first time. Looking at the tabs on top, Script Editor > Resources > Use Google APIs > Make sure Google Analytics API is on.
  3. Script Editor > Resources > Use Google APIs > Click “Google APIs Console” > Turn Analytics API On > Agree to Terms.
  4. Go to the “gaconfig” sheet and enter the profile number for a view you have access to. You can find the profile number either by:
    • Going to the Google Analytics tab and choosing “Find profile / ids”
    • Bringing up a report in the Google Analytics interface in that specific view and copying the number from the URL after the letter p.
  5. Go to “Dashboard” sheet and run “Get Data” from the Google Analytics tab.
  6. When the app asks for authorization to run, choose Yes > Accept.
  7. Good, now you never have to authorize or turn on APIs for this sheet again. Run “Get Data” once more.
  8. Your script should run and let you know everything went well and...boom! There’s data!

At first glance, the sheets might be a bit overwhelming, but stick with me for a bit. Your control board is “gaconfig,” where you’ll enter all of the report details before running. Start with any basic information such as your profile/view number and any date ranges you’d like to look at. All other query parameters should be familiar if you’ve run any sort of Google Analytics report in the past, but for more details, check out the “Notes” sheet. “RawDataSheet” is simply an aggregation of all reports run (so feel free to hide this guy). “Dashboard” is where you’ll find all of your reports in a simpler form.

GA Magic Script API

Note: Sometimes the script will look like it’s still running, even when it’s finished. If it’s been longer than 2 minutes, try to close and reopen the spreadsheet.

A big thanks to Google and Nick Mihailovski for a great integration between products. We’ve found a number of different uses for Magic Script and the Google Analytics API and hope you’ve learned a bit more about web analytics from our work. Finally, feel free to share anything you’ve improved, found, or created with the Google Analytics API!

A Movember to Remember


Movember

Around town, awkward moustaches are sprouting up all over the place. Your social feed shows both men and women alike on maniacal, moustachioed missions with everything from the real thing to fingerstaches and sunstaches. Is this some sort of new Thanksgiving tradition? Is this a visible warning to all the turkey birds that the grim reaper is approaching? Though I’m sure they had the most debonair facial hair, this can’t be what the Plymouth Pilgrims had in mind. What the heck is going on?

It turns out that this mad movement goes by the name of Movember and all the turkeys are really just ordinary people banding together to help raise awareness for men’s health issues and fund programs that help make progress. More specifically, the funds go to the Movember Foundation which, for the last decade, has been actively supporting campaigns to raise awareness about testicular cancer, prostate cancer, and men's mental health issues.

Brief History

This all started in Australia in 2003 when Travis Garone and Luke Slattery thunk up the idea over a few pints. It began with a question about where the art of the moustache had gone. It ended with the start of a new campaign. Inspired by a friend’s mother who was fundraising for breast cancer, the gents set off to do something similar for men’s health, with prostate cancer as their initial motivation. Today, the movement has become a worldwide phenomenon. In 2012, over one million Mo Bros and Mo Sistas together raised $147 million for the cause.

Statistics

According to Movember, here are a few interesting stats:

  • 1 in 2 men will be diagnosed with cancer in their lifetime; 1 in 3 women will be.
  • Prostate cancer is the most frequently diagnosed cancer in men after skin cancer.
  • Testicular cancer is the most common cancer in American males between the ages of 15 and 34.
  • 97% of prostate cancer cases occur in men age 50 and older.
  • Men generally have lower levels of awareness of mental illnesses than women.
  • One of the most common manifestations of mental illness is depression.
  • Over 6 million men (7% of the population) are diagnosed with depression each year.

How to Help

Friends of Viget have joined up with friends of DC’s Web Development Group to form a team to help with this worthy cause. We’re encouraging you and your friends to join our team of 20+ in support of this effort. Anyone is welcome, so please tell a friend.

Pre-mo baby faces. Viget (left). Web Development Group (right).

Also, if you’re in the Washington, DC area on November 21, 2013, we hope you’ll join us at the Gathering of the ‘Stache party organized by WDG, co-sponsored by Viget, and hosted at Canvas Co/work. Space is limited so grab your ticket today. We’ll see you there.

Just “UX” It – How To Apply UX Thinking to Real Life Projects (and Karaoke)


Oftentimes, I find myself starting a project at home by saying I need to “UX” something – my room redesign, my weekend plans – you name it, I’ll “UX” it. This is just my way of saying that I’m going to employ the techniques I use as a UX designer outside of work.

As UX designers, we seek to create the best possible solution for the people for whom we’re solving a problem. Our process is goal-oriented, iterative, and sees experiences as whole units, not a bunch of parts. We strive for efficiency, usability, and delight. While we use very specific tools and techniques to do our jobs, our general approach can be applied to projects that aren’t design-related.

Do you want to improve your problem solving skills? Do you complete projects and feel lukewarm about the results? If so, I highly suggest you start to “UX” things, too.

Why You Should “UX” Your Projects

There are a few reasons as to why you should “UX” things:

  • It takes the “where do I start?” out of the equation – Oftentimes, the hardest part of a project is starting it. Having a tried-and-true method you follow makes the beginning of a project feel less overwhelming.
  • It’s methodical – Though the details can be tailored, UX designers follow a process that looks pretty similar from project to project. This makes it easy to learn and improve.
  • In my experience, it works – As I've used UX methods more outside of work, I’ve found myself much happier with the outcomes of the projects I take on. Of course, no process can guarantee perfect results, but if things go wrong, design thinking provides a method for how to fix them.

How to “UX” Something

To “UX” a project, you have to understand how UX designers approach challenges. Our general approach is quite pragmatic and usually follows three steps: discovery, iteration, and execution.

  • Step 1: Discovery – First, gather information that will give your project direction. Your research should help you define the problem you’re solving, create goals that will guide the project and measure its success, and give you a solid understanding of the needs of those for whom you’re solving the problem (note: you can be your own audience!).
  • Step 2: Iteration – Next comes the “ideas” portion of your project. Here, you will brainstorm many broad solutions. Never settle on the first idea -- it’s usually not the best one. Then, you will choose a few ideas to develop further and refine them into a fully-formed solution. Your ideas should always be informed by the needs of your audience (again, this may just be your needs if you are the audience), and you should revisit your goals periodically to make sure you’re on track.
  • Step 3: Execution & Evaluation – Finally, you will release your work into the wild and evaluate how it performs. Did the solution meet your goals? Did it solve your problem? Did it create a good experience?

Now, you may be thinking “hmm, I already do research and come up with multiple ideas when I’m working on a project. How is this different from regular problem solving?” That’s a great question.

First, regular problem solving tends to be more about coming up with a solution, while a UX mindset is focused on coming up with the best solution for a specific audience. Regular problem solving does not put the same emphasis on the needs of the audience, efficiency, or overall experience. You can easily solve a problem while also creating a poor experience that makes things more complex. UX seeks to avoid this.

Next, a UX approach goes beyond the regular problem solving approach. Above, I mentioned setting and revisiting goals, brainstorming multiple ideas, and evaluating and improving a solution, all while keeping in mind the needs of your audience at every step. Sure, these techniques can be used in regular problem solving, but I think they have a tendency to get left out because of the additional time, effort, and practice they take. UX is centered around these extra steps, and I find they’re worth the effort.

UX in Real Life

Before we continue, I’d like to make a confession: I am a karaoke junkie. In fact, I love it so much I joined a karaoke league about two months ago. This means that every week, I dress up in costumes and sing pop songs as part of a team, hoping to beat out our competition and have the satisfaction of knowing that I’m a celebrity for a night. There are very few things I can imagine that are nerdier.

But wait, there actually is one. Every week, I try to “UX” our team’s group performance to help us win. Let’s see what UX looks like when it’s applied to a real-life scenario:

  • Discovery – Our problem and goals are the same week to week: We need to pick a song to sing as a group, and we want it to be a song that the audience will enjoy, that will be fun, and that will get us a win. As for information gathering, I’ve been paying attention to what songs people like to hear, and what kinds of performances usually win.
  • Iteration – Instead of settling on the first song someone suggests, my team brainstorms a bunch of ideas. Once we’ve settled on a song, we discuss the theme and costumes for the number, with each person building on what’s already been suggested. This can lead us in interesting directions, like an Oregon Trail-themed number. At every stage, our ideas are informed by what we think the audience will like.
  • Execution & Evaluation – Then it’s performance time! Afterwards, I like to evaluate how we did. Did we win? Have fun? If not, why? What can be improved for next time?

I know it may seem over-the-top to apply UX to karaoke, but if we just fly by the seat of our pants, performances are unprepared, we lose, and people don’t have fun. A little UX has gone a long way to help us get some clutch wins! And just think, if UX can work for something as silly as a karaoke league, imagine the impact it can have on more complex projects.

How will I know if I’m doing it right?

If you’re not a UX designer, you may wonder whether this method is working once you start to use it. How will you know if you’re doing it right?

First, evaluate every project against its goals. If the solution you came up with meets your goals, and you’re feeling happy overall with the outcome, you’re probably doing things right. Second, if you are solving a problem that involves other people, there’s an easy way to figure out if your solution worked: ask them how it went. They’ll be happy to tell you what they liked and what can be improved.

If you find that things didn’t go well at first, fear not! Design thinking is a method that requires a little practice, and you’ll understand how to apply it better over time. If you keep at it, these steps will become more automatic, and you’ll be enjoying the benefits of design thinking in no time.


Khan Academy’s New Avatars: We Made Them, Now You Can Name Them


If you’ve ever wanted to name a cute avatar that’s going to be seen by thousands of people around the world, here’s your chance. We’ve been working with our friends at Khan Academy to help brainstorm, design, and illustrate avatars for the new learning dashboard that they launched in August.  As students progress through courses, earning points and gaining badges for their new skills and knowledge, they can “level up” their avatars. It’s a fun way to keep students engaged, and we were happy to lend a hand. 

When Khan Academy introduced the first version of profile avatars, students loved them—so much so that they wanted more. They petitioned the designers and garnered more than 2,900 votes, making it the most frequently requested feature.


Nearly three and a half years ago, Viget helped Khan Academy with their very first branding and design explorations. So we were thrilled when their Lead Designer, Jason Rosoff, wanted to work with us again, this time to help improve their avatars (below) and come up with new and updated concepts.


We began this fun little project by thinking through several concepts centered on updated versions of “Spunky Sam” and “Purple Pi” while also considering user requests found on the forums. Our team then realized that, just as Khan Academy helps students learn and grow, so too should their avatars evolve and grow as students earn more points and progress through the learning dashboard. After several internal discussions, we landed on the idea of avatars that start as seedlings and evolve into full-fledged creatures. Here’s an early sketch:


Now that we’ve created these designs, Khan Academy needs your help and input! Use the awesome HTML/CSS submission tool and help name these cute little buggers. We can’t guarantee any prizes or fame; but, if your name is selected, know that you’ll be supporting a great organization that’s revolutionizing the way the world learns.

My Take on Phone Interviews


I've been doing phone interviews at Viget for six years. The phone interview happens early in our recruiting process, usually after a resume and skills review, and before any quizzes, homework, or face-to-face meetings (although, there are some exceptions). They aren't easy, but they are rewarding, and they're often the highlights of my week.

Objectives

Phone interviews at Viget help us answer five initial questions about you, the candidate:

  1. Do you come across as a professional person with ideas, opinions, and values that would fit in well at Viget?
  2. Can you clearly and confidently articulate what you're good at and what you want from your next job?
  3. Do your priorities align well with our company culture and with the position?
  4. Would the team like to talk to you?
  5. Do you seem genuinely enthusiastic about the opportunity at Viget?

Limitations

Phone interviews are not good at definitively answering these questions (and I think it's important for us to remember that we shouldn't expect them to):

  1. Are you good at the core skills of the job?
  2. Are you smart?
  3. Are you a person of integrity who is going to work hard?

The folks I work with have heard me say many times, "I love the truth." I have no problem with a 30-minute conversation that concludes with me saying, "You seem pretty great, but I don't think you're right for the job." That doesn't seem like a waste of time and, usually, it doesn't feel unkind or judgmental -- it just feels true. Nobody wants a job that they won’t be good at or that won’t let their talents shine. The hard calls are the ones where it's not clear and the decision about whether or not to move ahead is murky. I'm grateful for my six years of experience, though, because I can see more clearly than I used to and I can trust my gut.

Improvements

Over the years, I've also adjusted my approach to the phone interview. Here are some things I try to do consistently now that I didn't always do before:

  1. Offer to take the call outside of business hours when I can.
  2. Offer to do a Hangout or video call when I can.
  3. Ask very direct questions about gaps in employment history, lack of a college degree, or other topics that have me curious.
  4. Take thorough notes, capturing more than just the gist of your answers, but your phrases, specific examples, stories, and word choices.
  5. End the call with an outline of what the rest of the process will look like if we are going to move forward.

Goals

Still, there are things I wish I were better at accomplishing during these conversations. I wish I were better at:

  1. Understanding the nuances of technical skills and interests, so that when you tell me you love working with a certain technology, I can derive a more subtle meaning from your comment.
  2. Understanding the real work behind marketing and account management positions, particularly in large organizations. I think I just don’t have sufficient first-hand experience with those environments to really appreciate what goes into various roles and responsibilities.
  3. Building rapport with a shy, reserved person. We employ a lot of introverted people who are talented and awesome; I have great rapport with many of them, but it takes time. Sometimes, on a call, I can tell that the main barrier to feeling a connection with a candidate is their temperament -- but, I struggle to know how to best respond to that barrier.
  4. Being more succinct. Sometimes I'm eager to tell you all about Viget. Hopefully, I get points for my enthusiasm -- but, I do aim to listen more and talk less on these calls.

Most candidates will not move beyond the phone interview. For those who do, the evaluation gets more rigorous, thorough, and time-consuming. I hate to burden my team or a candidate with the time commitment if I know it's not going to result in a hire. For those who don’t move past the phone call, I hope the conversation reflects well on Viget, even if the result is disappointing. I hope all candidates realize that the phone interview is one of the best parts of my job: it is a privilege to get to know (even minimally) so many talented people.

A Focus on PM Training: The Next Boulder Web Project Manager Happy Hour


On Thursday, November 21st, Boulder-based, web-focused Project Managers will get together for another Happy Hour. Based on a poll of what topics were most interesting to members, this meetup’s topic will focus on PM Training. We will discuss what works and what doesn’t in terms of improving our skills as PMs.

Conferences dedicated to Web Project Management are few and far between (which might be why the recent Digital Project Management Summit had a waiting list as large as the attendee list). Communities of PMs are typically small, and there aren’t many classes focused on what we do. So, what are PMs doing to learn and get better, what can we be doing, and how can we train others to be successful PMs?

If this meetup group has taught me anything, it’s that we don’t need to figure everything out alone. We are building a community together, and leveraging each others’ knowledge and experiences is a main benefit of that community.

Are you a Web Project Manager in or around Boulder? Please come and join us on November 21st!

What: BWPM Happy Hour - PM Training, what works and what doesn’t?
When: Thursday, November 21st at 6:00PM
Where: Boulder Beer Company
Why: To learn how other Web PMs continue to learn, grow, and get better.
How: RSVP on the Boulder Web PM Meetup Page

Are you interested in sponsoring a future Boulder Web PM Meetup? Let me know!

Can’t join us but have thoughts on training for Project Management? Comment below and let us know!


Creative Workshops: Think Differently


Last year Leo Burnett Worldwide wrote about their experience with Farmhouse Lab during Chicago Ideas Week. They assembled a group of innovators and big thinkers for a workshop titled Reinventing the Everyday Object. Their goal was to consider how they could take ordinary, everyday objects and improve them to serve a more useful purpose.

Workshops can be a great way to break out of your day-to-day routine and think about something from a fresh perspective. They also have benefits like team building, learning how to generate a lot of ideas quickly while homing in on the best one, building on the ideas of others, and thinking on your feet. So I assembled some of my fellow Viget team members for our own workshop. We used the same premise of reinventing everyday objects but tried to stick to a one-hour timeframe. By the end of the hour, we had some brilliant ideas floating around the office, including:

  • a clothes hanger with LED lights on the top that could indicate things like size and color, or even how long it’s been since you wore something
  • a backpack umbrella that attaches to bike handlebars covering the rider like a tent and retracts when you’re done
  • a streamlined smartphone case doubling as a wallet that can hold just enough cash and credit cards, only opens with fingerprint security, and connects to apps on the phone to handle receipts

Todd Moy sketches concept for the Creative Workshop

Start Your Own

Creative workshops like this are a great exercise to do individually or with teams. Here are a few things that might help you get started:

Rules

Keep it light. This means less overhead for you as the organizer and more room for innovation.

Time
  • Keep it to about one hour. The breakdown looks something like this:
    • 5 minutes to explain the exercise.
    • 40 minutes to brainstorm.
    • 15 minutes to present ideas.
Teams
  • If you have multiple people, teams of 2-4 work well.
  • Mix up the teams with people from different disciplines, backgrounds, and personality-types.

A few things specific to the Reinventing Everyday Objects workshop:

Premise

The premise of this workshop is to identify problems or areas of improvement for the object and design around them.

Objects
  • Have physical objects available for the teams to hold 
  • Some of the objects that we had to work with included:
    • Umbrella
    • Doorbell
    • Hanger
    • Wallet
    • Alarm clock
    • Shower Head
    • Light switch
Supplies
  • Whiteboards are helpful for quick sketching and iterating ideas.
  • Giant easel-sized Post-it paper is also helpful for sharing ideas at the end of the workshop if everyone isn’t in the same room.

Peyton presents his team's concepts during the Creative Workshop

Future Workshops

Based on suggestions from the team, here are a few things we might change or incorporate into future workshops:

  • Establish one problem that everyone works on.
  • Establish whether the end-goal is many different ideas, or one good idea.
  • Establish whether the ideas should be practical or not.
  • Structure final presentations more like a pitch, and let all of the participants decide which idea they would invest their money in for development.
  • Get everyone out of the office to observe a situation, or just to change scenery.

After the workshop, several people said that stepping away from the screen to think creatively about tangible, non-digital things was invigorating. Others said that it was great to work with team members that they don’t usually get to work with. We all went back to work ready to tackle challenges in new, creative ways.

Try taking some time away from what you do everyday to think about something differently. If you’ve ever participated in creative workshops, or plan to organize one in the future, I’d love to hear about it.

You can check out photos of our recent workshop here.

How To Create PDFs in Rails


We worked with The Bill of Rights Institute recently to create an interactive digital course for American History teachers. One of the interesting challenges, among many, stemmed from the fact that the project had large sections of readable content. One of our goals was to make it easy for students and teachers to print out their reading material if and when they’re not able to read it on screen.

To make printing possible, I needed to create PDF files that were similar to the HTML content. These files needed to be both viewable in the browser and downloadable from the page the content lived on. In some cases, we wanted to selectively remove some elements from the page or apply a slightly different stylesheet for printing the content.

After a bit of research, I found two possible approaches:

  1. Generate a PDF “by hand” from source data using a tool like prawn (sketched briefly below this list)
  2. Take a source HTML document and transform that into a PDF 
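
For context, the “by hand” route means describing every element of the document yourself. Here’s a minimal prawn sketch (purely illustrative; we didn’t go this way, and the filename and copy are made up):

require "prawn"

# Every heading, paragraph, and spacing rule is declared by hand;
# nothing is reused from the existing HTML templates.
Prawn::Document.generate("reading.pdf") do
  text "Reading Title", :size => 24, :style => :bold
  move_down 12
  text "Body copy would be laid out here, element by element..."
end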

Taking the source HTML document and converting it sounded ideal, because I wanted to keep the CSS styling and layout of the page with minimal modifications. Since prawn is not an HTML to PDF generator, I investigated the following tools:

  • Prince — A command line tool that can take an HTML source file from disk and turn it into a PDF. It can read from a local file or a URL. However, it’s pretty pricey; a server license carries a one-time fee of $3800.
  • DocRaptor — Basically, this is Prince offered as a service.
  • wkhtmltopdf — A free option that uses the WebKit rendering engine within QT.

wkhtmltopdf sounded like the best option to explore since it uses a browser engine to render the page and then save as a PDF. I found two Ruby gems that use this library: PDFKit & Wicked PDF.

I initially started using PDFKit and its included middleware, and I was able to very quickly get viewable and downloadable PDFs.
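
For reference, the middleware setup is only a couple of lines. Here’s a minimal sketch, assuming a standard Rails app with the pdfkit gem in the Gemfile (the path restriction is a hypothetical example):

# config/application.rb (inside the Application class)
# With the middleware mounted, requesting a matching URL with a .pdf
# extension returns that page rendered as a PDF.
config.middleware.use PDFKit::Middleware,
  { :print_media_type => true },   # render with print stylesheets
  :only => [%r{^/readings}]        # hypothetical path prefix to limit PDF routes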

I appreciated that the necessary binary files are included with the gem for a number of operating system environments, which saves you from having to install different packages in your respective application environments (OS X vs Ubuntu).

While PDFKit worked great at first, I eventually encountered a roadblock: I needed to be able to include different stylesheets & layouts for different "types" of PDF files, which PDFKit didn’t appear to support. I was also struggling to get asset paths working correctly on Heroku. The PDF generation actually happens in a separate process, so I somehow needed to use absolute URLs for paths to all assets.

After a bit of searching, I found the excellent Wicked PDF gem.

Wicked PDF

Wicked PDF doesn't package the binaries in the main gem, but it's simple to include the binaries that you need (you can grab them from the PDFKit gem) in your bin/ directory and set up Wicked PDF like:

platform = RUBY_PLATFORM 

if platform.include?("darwin") # OS X machine 
  binary_path = Rails.root.join('bin', 'wkhtmltopdf-0.9.9-OS-X-i386').to_s 
elsif platform.include?("64-linux") # 64-bit linux machine 
  binary_path = Rails.root.join('bin', 'wkhtmltopdf-amd64').to_s 
end 

WickedPdf.config = { :exe_path => binary_path }

Wicked PDF also has the optional middleware, but I decided to not use it so that I could have more fine-grained control over where PDF files can be accessed and specifying their layout and template for each "type."

Viewing PDFs in the browser:

respond_to do |format|
  format.pdf do
    render :pdf    => "my_pdf_name.pdf",
      :disposition => "inline",
      :template    => "controller_name/show.pdf.erb",
      :layout      => "pdf_layout.html"
  end 

  format.html
end

Downloading PDFs as a file:

def download 
  html = render_to_string(:action => :show, :layout => "pdf_layout.html") 
  pdf = WickedPdf.new.pdf_from_string(html) 

  send_data(pdf, 
    :filename    => "my_pdf_name.pdf", 
    :disposition => 'attachment') 
end
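
For completeness, here’s one way these two actions might be wired up in the router (a sketch; the resource and action names are assumptions, not the actual routes from the project):

# config/routes.rb -- hypothetical routes for the inline and download actions
resources :readings, :only => [:show] do
  member do
    get :download   # streams the generated PDF as an attachment
  end
end

With routes like these, /readings/1.pdf would hit the inline format.pdf responder, while /readings/1/download would send the file.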

Wicked PDF also includes examples and handy helper methods for specifying assets and substituting them inline into the HTML document: <%= wicked_pdf_stylesheet_link_tag "my_styles" %> & <%= wicked_pdf_javascript_include_tag "my_scripts" %>

It also allows for easily debugging the PDF page by viewing it as an HTML page. You can do this by using the following option:

:show_as_html => params[:debug].present?

This allows you to simply add a ?debug=true to the end of your path. Example: http://example.com/my_pdf_name.pdf?debug=true
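
In context, that option just gets merged into the render call. Here’s a sketch based on the inline-viewing example above:

format.pdf do
  render :pdf      => "my_pdf_name.pdf",
    :disposition   => "inline",
    :template      => "controller_name/show.pdf.erb",
    :layout        => "pdf_layout.html",
    :show_as_html  => params[:debug].present?  # ?debug=true renders plain HTML
end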

Ultimately, I found Wicked PDF to be the best choice thanks to its ease of setup, its support for different layouts and assets per PDF, and its excellent documentation and examples. The examples cover how to use assets on Heroku, how to use assets from a CDN, how to use the asset helper methods, and how to generate and download files using send_file.

Have you worked on a similar project? Any input on best solutions? Let us know in the comments below.

EarthEcho Gets a New Site for a New Adventure


We’ve been a long-standing friend and proud creator of EarthEcho International and Philippe Cousteau’s sites, so when Philippe called us up to chat about EarthEcho’s new initiative, we were excited to get onboard. EarthEcho International’s latest venture, EarthEcho Expeditions, takes the rich Cousteau legacy of discovery one step further. It brings students from around the world together on a journey to explore aquatic hotspots and inspires them to serve as the next generation of environmentalists.

EarthEcho International was founded by Philippe Cousteau, alongside his mother Jan and sister Alexandra, in honor of their father Philippe Cousteau Sr., famous son of the legendary explorer Jacques Yves Cousteau. Philippe first came to us with the intention of adding EarthEcho Expeditions to their existing site as a new program. After we dove into the details of the program and how it was going to change the face of EarthEcho International, however, we learned that such a not-so-simple addition would either add clutter or, worse, get lost. Either way, it just wouldn’t do the program justice.

We went back to the drawing board. We had to figure out how to fold the new Expeditions focus in with the evergreen teacher resources in a way that balanced the unique differences of each while also leveraging the shared assets among them. We also needed a way to artfully incorporate the new Expeditions branding, with its logo and bright colors, with the rich colors and earthy textures of EarthEcho’s core brand.

Most important, however, was the need to simplify the site architecture and reduce the bloat that had collected from years of adding on now-outdated programs and resources. We wanted to allow the Expeditions to take center stage. Such simplification would be critical as we made the site responsive, an absolute requirement for any site targeted toward middle and high school students who are constantly online and on-the-go, as well as teachers who plan to use these resources in the classroom.

Our final result was a site that's open, clean, ultra-focused, and easy to navigate (and even easier to update on the admin side). Every page has a lot to offer, but our favorite part of the site is the Expedition page itself. It features an interactive map that marks the expedition trail and describes each location. The custom-colored Google Map was fun to produce and enables EarthEcho to easily add future Expeditions -- they don’t have to rely on us to design and build a custom map image.

You can either scroll down or jump directly from the map to the updates grid below. The fluid grid, complete with seven filterable content types across each day of the expedition, is filled with slick transitions and interactions that encourage teachers and students to explore more.

Last but not least, the site’s new responsive design makes it easy to access anywhere, on any device. For teachers and classrooms, this means a beautiful, tablet-based experience, and EarthEcho’s rich content isn’t sacrificed on mobile devices.

It’s been just over a month since launch and we’ve already seen a flurry of registrations and a lift in traffic. We’re excited to have been a part of this new chapter in EarthEcho’s life and we look forward to future enhancements as EarthEcho Expeditions improves and grows. We’re certain this new initiative, and the new website, will help expand EarthEcho’s mission and inspire more youth to get involved and act as environmental change leaders.

Designing Device Assets: Templates and Tips


In olden times, as long as you remembered to create a favicon you were golden. Now, between phones, tablets, retina screens, iOS 7, Windows 8, everything and the kitchen sink, there are literally dozens of icons, images, and assets a designer can create for a project. It can be overwhelming to keep track of all the variations and each of their individual quirks. And if you're me, it can be daunting even to remember to design all of these assets in the first place. I'd like to think that when I hand off a design to a front-end developer, I've crossed every T and named every Photoshop layer. And yet every time, every time, ten minutes after handing over a project, Jeremy asks me for the Apple touch icons, which I have invariably forgotten.

To end this vicious cycle of forgetfulness and confusion, I have created a set of Photoshop templates for device assets. These templates cover, as completely as possible, favicons, Apple touch icons, startup images for iPads and iPhones old and new, and Windows tiles for IE10 and IE11.

Download Template Set .Zip

Favicons

Good old favicons. In order to have the classic 16x16 px image ready for retina, you now need a 32x32 px version as well. You can save a 32x32 px .ico file straight from Photoshop (with help from Telegraphics's plugin) and let non-retina browsers do the work shrinking it down for you. Or for the pixel perfectionist, you can create 16x16 and 32x32 .pngs and use a tool to combine them into one .ico file. Check out the links below for a good run-down of the best tools for the job. This template set includes both 16x16 and 32x32.

Further favicon resources


Apple Touch Icons

Web apps, responsive sites, anything that a user might want to save to their Apple device's homescreen needs an Apple touch icon. Between iPads, iPhones, iOS 7, and retina screens, there are several variations to account for, the largest being 152x152 px. This template set includes a whopping seven templates to cover everything you could possibly need. Taylor Fausak has a fantastic, clear breakdown of how each size correlates to a different device: 

  • 152x152 for retina iPads on iOS 7.
  • 144x144 for retina iPads on iOS 6.
  • 120x120 for retina iPhones & iPod touches on iOS 7.
  • 114x114 for retina iPhones & iPod touches on iOS 6.
  • 76x76 for iPads on iOS 7.
  • 72x72 for iPads on iOS 6.
  • 60x60 for iPhones & iPod touches on iOS 7, although no such devices exist.*
  • 57x57 for iPhones & iPod touches on iOS 6.

*I'm not including a template for a non-existent combination of device and operating system. Even I have my limits.

Remember, Apple automatically rounds the corners of your icon for you, so you don't have to do a thing. iOS 7 doesn't apply any additional styles to your icon. With earlier iOS versions, you can choose whether or not to have Apple apply their good old glossy shine on top of your image. 

Apple Startup Screens

Now that a user can fire up your site straight from their homescreen, it's nice to have a startup image while it loads. Again, the differences between devices, operating systems, and screens means lots of variations! iPads can have startup screens in either landscape or portrait mode. Meanwhile, iPhones can only use a startup screen in portrait. The fun part about landscape iPad startup screens, as you can see when you crack open the templates, is that the image itself needs to be rotated 90 degrees. Here's the breakdown of templates included:

  • 1536x2008 for retina iPads in Portrait mode
  • 1496x2048 for retina iPads in Landscape mode
  • 768x1004 for iPads in Portrait mode
  • 748x1024 for iPads in Landscape mode
  • 640x1096 for retina iPhone 5
  • 640x920 for retina iPhones pre-iPhone 5
  • 320x460 for ye olde iPhones

Further iOS resources

Windows 8 Tiles

We're still not finished! Windows 7 simply used your 32px favicon to pin sites. But Windows 8 caught up with the icon craze and needs its own, unique assets. IE10 tiles are a little limited and tricky. The tile itself is 150x150 with a solid color background and the site's title in text towards the bottom. So where does your icon come in? It displays as an 80x80 square PNG in the center of the tile, but the asset's actual dimensions are 144x144 to account for high-resolution screens and devices. (Nope, your math isn't wrong. Now that you're used to doing everything at 2x for retina, Windows has its own spin: do everything at 1.8x.) Since your image won't cover the entire tile, the best practice is to make it transparent. At least you can control the solid color it will sit on.

IE11 is a bit more flexible. Very flexible, in fact. You have full control of the tile's look; you also have a whole family of tile dimensions to design. These are:

  • 128x128 for small square tiles
  • 270x270 for medium square tiles
  • 558x558 for large square tiles
  • 558x270 for wide rectangle tiles

Again, these assets are 1.8x their display dimensions. So they should look good when displayed at 70x70, 150x150, 310x310, and 310x150, respectively. Windows again can insert a text title at the bottom of the tile; keep the bottom 36 pixels clear of any noisy imagery that could affect legibility. Windows decides whether white or black text will be more legible against your design, so don't necessarily count on the text being the color you'd prefer.

Further Windows 8 resources

Phew!

I'm sure you could go even deeper down this rabbit hole if you wanted. And this brand new template set is probably somehow already outdated. But hopefully it's useful for you and your process. Make sure to check out the resources I link to for top-notch insights into the implementation that goes along with these assets. 

Download Template Set .Zip

Behind the Design: Khan Academy Avatars


About two months ago, Khan Academy asked us to create avatars for the new Learning Dashboard which launched in August. We thought it would be a fun experience for students to see their avatars evolve into advanced creatures as they progress through their online courses. This was an exciting opportunity for us to showcase our skills in character design and illustration.

DEFINING OBJECTIVES

We wanted the avatar designs to be simple, charming, and unique. Since these illustrations were for an educational platform, we steered away from making anything too crazy, like a monster from Digimon or Yu-Gi-Oh!. In the past, Khan Academy users preferred the more androgynous, creature-like avatars over the robotic ones. Therefore, we decided to make our designs more like simple monsters, dinosaurs, and animals. As the avatars evolve, their shapes and forms would feature different character traits to give the students something to look forward to when they advance to the next level. Each avatar would have a specific color and element to make it more memorable and distinguishable. To tie in with the leaf in the Khan Academy logo, we decided that the avatars would start as seedlings and later evolve into developed creatures.

COLLABORATIVE SKETCHING

Minh and I started sketching avatars individually on paper without sharing our work until we were satisfied with our concepts. Then we took our ideas to the whiteboard for close collaborative sketching -- this process felt more organic and transparent because we were drawing on the same plane, which allowed for instantaneous peer feedback. If I needed help with an idea, Minh would simply draw his version of an avatar next to my initial sketch. Sometimes it was a back-and-forth process of drawing on each other's sketches. If we weren’t happy with an avatar’s characteristic, we would draw it over and over again in different variations until we were satisfied.


AVATAR CHARACTERISTICS

We were intentional about designing the characteristics of the avatars to create variety for the students to choose from. As the creatures evolve, their unique features transform in scale, quantity, shape, and position. For example, the green avatar’s leaf on top of its head grows larger and multiplies as it evolves. In the early stages, its body starts out very round with short limbs. As the green avatar evolves, its limbs grow larger with new joints, and it eventually ends up standing upright like a human to convey higher intelligence.

We scaled up the heads of early avatar stages to convey youth and double down on cuteness. Also, we kept the eyes small to let the other body features define the avatar’s essence. Big eyes with extra details would have made the creatures overly cartoony, and would not have translated well into an education platform.

Each avatar had to embody an elemental force such as fire, water, vegetation, electricity, darkness, and telepathy. Sometimes we chose the avatar’s element before drawing its form, and at other times we assigned an avatar’s element after drawing the basic form of the creature.


VECTORIZE

Minh and I took pictures of the refined sketches and then traced the photos in Illustrator. We primarily used the Pen tool and Pathfinder palette to create individual avatar characteristics. Also, we were aiming to create curvilinear lines and shapes that were smooth and without excess anchor points to convey a clean look. To keep things simple and consistent, we initially used only two shades of the same color for each avatar. Then we added one or two extra shades of color to each avatar to define shadows, highlights, and other details.

CONCLUSION

We’re really happy with the final product and had a ton of fun making them. I look forward to more collaborative sketching; it made brainstorming ideas and executing on them smooth and harmonious. We hope that the Khan Academy avatars’ fresh look and “leveling-up” concept will enhance the users’ experience by giving students a fun way to personify their learning success.

You can help Khan Academy name these new avatars by using this wicked HTML/CSS submission tool.
