Industry Benchmarks About Hours Worked Per Week

Posted on September 10, 2007, 8:44 AM by Steve McConnell to 10x Software Development

Categories: Methods & Processes, Metrics, Management

One of my readers asked the following very reasonable question: 

We are looking for industry benchmarks detailing the amount of time developers spend on a percentage basis in the following three categories:

1) Core job activities (writing, testing, deploying code, etc.)
2) Meetings
3) Administrative activities (training, reporting, etc.)

The questions are reasonable. Unfortunately, one of the lessons I've learned after looking at lots of data on questions like this is that sometimes reasonable questions don't have reasonable answers!

In this case, what I would call "project focused hours" per month can easily vary by a factor of two between different companies based on factors like how much time is spent in meetings, how long the work days are (think government job vs. internet startup), number of holidays, number of training days, number of non-project meetings, level of support required for software already in production, etc. A common "big company" planning number is 6 hours of project-focused work per day, for the days that the employee is actually at work, but that can vary a lot across big companies and even within big companies. Based on what we see in our consulting practice, I think it's rare to average 6 hours per day of truly project-focused work in a non-startup company. The most common distraction from project-focused work we see is time spent supporting prior releases that are in production.
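To see how quickly these factors compound, here is a back-of-the-envelope sketch. All of the input numbers are illustrative assumptions (including the 4.33 weeks-per-month conversion), not industry benchmarks:

```python
# Back-of-the-envelope estimate of project-focused hours per month.
# Every input below is an illustrative assumption, not a benchmark.

def project_hours_per_month(workdays_per_month=21,
                            focused_hours_per_day=6.0,
                            meeting_hours_per_week=5.0,
                            support_hours_per_week=4.0,
                            training_days_per_year=10):
    """Rough monthly project-focused hours after subtracting overhead."""
    # Training days reduce the days available for project work.
    training_days_per_month = training_days_per_year / 12
    available_days = workdays_per_month - training_days_per_month
    gross = available_days * focused_hours_per_day
    # Meetings and production support come out of the focused time
    # (4.33 is the assumed average number of weeks per month).
    overhead = (meeting_hours_per_week + support_hours_per_week) * 4.33
    return max(gross - overhead, 0.0)

# A "big company" profile vs. a meeting-heavy, support-heavy one:
big_co = project_hours_per_month()
meeting_heavy = project_hours_per_month(meeting_hours_per_week=12.0,
                                        support_hours_per_week=6.5)
print(round(big_co), round(meeting_heavy))
```

With these particular (made-up) inputs the two profiles differ by roughly a factor of two, which is exactly the kind of spread that makes a single "benchmark" number misleading.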

The number of meetings varies a lot too and is significantly affected by company culture. When I was at Microsoft in 1990-91 I probably spent less than 5 hours a week in meetings. In contrast, a former Microsoft employee told me earlier this year that on his team he was booked in meetings from 10:00 to 4:00, five days a week. Lots of managers at other companies have told me that they're in meetings all day every day and get most of their "real work" done during evenings and weekends, so obviously there's a big difference between Microsoft 1990 and Microsoft 2007, and among different companies.

The amount of training, reporting, etc. varies just as much--even more on a percentage basis. Best-in-class companies typically devote 8-12 days per year to training, whereas many companies we see allow technical staff to take only one class per year. Many of the companies we see don't systematically support any training days at all.

Bottom line: there's just too much variation among companies to make meaningful statements about "benchmark" allocations to work and overhead time categories. That doesn't mean you won't find published sources that claim to be benchmarks, but the sources you do find are usually limited by the fact that their authors haven't had exposure to enough companies to realize how much variation there is.


Bo Peaslee said:

September 10, 2007 1:01 PM

Seems to me someone ought to find a way to "normalize" this data. The basis for normalization could be company size in number of employees, IS department (domain-based) size in number of employees or number of developers, etc. Developers or IS staffers would have to be limited to testers, programmers, architects, etc. I wouldn't include project management, marketing, or management above team managers.
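Bo's normalization idea could be sketched roughly as follows: group reported time allocations by a normalization basis (here, developer headcount) before comparing averages. The survey rows and size bands below are invented purely for illustration:

```python
# Hypothetical sketch of normalizing time-allocation data by company size.
# All figures are made up for illustration; they are not survey results.

surveys = [
    {"developers": 12,  "core": 55, "meetings": 25, "admin": 20},
    {"developers": 300, "core": 45, "meetings": 35, "admin": 20},
    {"developers": 8,   "core": 70, "meetings": 15, "admin": 15},
    {"developers": 950, "core": 40, "meetings": 40, "admin": 20},
]

def size_band(n):
    """Bucket companies by developer headcount (cutoff is arbitrary)."""
    return "small" if n < 50 else "large"

def averages_by_band(rows):
    """Average each time category within each size band."""
    bands = {}
    for row in rows:
        bands.setdefault(size_band(row["developers"]), []).append(row)
    return {band: {k: sum(r[k] for r in group) / len(group)
                   for k in ("core", "meetings", "admin")}
            for band, group in bands.items()}

print(averages_by_band(surveys))
```

Even in this toy form, the grouping makes the point of the post visible: the "small" and "large" bands produce quite different averages, so a single pooled benchmark would describe neither.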

Pawel Brodzinski said:

September 11, 2007 7:22 AM

Besides organization size and the software development/project management methodologies used, it also matters which phase the team is in at the moment.

A couple of examples:

1. I once worked on a team that was focused on a software product (project management and implementations were handled by a different team), and we had four rather clearly separated phases: vision, design, development, and stabilization. During the first two phases there were a lot of brainstorming and design meetings with developers; a few of the most experienced programmers spent between one third and half of their time in meetings. During the development and stabilization phases it was completely different: apart from weekly team meetings (less than 3 hours a week), there were almost no other events.

2. In a small company delivering mid-size projects for carrier-grade customers, things are turned upside down when a really big project is sold and more than half the company is engaged in it. With a tight schedule you need regular multi-hour meetings to spread information among the teams, analyze risks, rearrange tasks, etc. With an approach that is more agile than formalized, those meetings keep everyone's focus on the goal and help a lot in planning further steps. Time spent in meetings increased from about 2 hours to 10 hours weekly after the project started. It will drop back to the typical level after the most important milestones are met.

One more thing here - I think time spent on administrative activities will be fairly constant for an organization (and it will grow as the company grows).

In my current company, developers spend less than 10% of their time in meetings and much, much less on administrative activities (excluding training, which is highly customized to each person's skill set).

mukul@indiusnet.co.in said:

September 12, 2007 5:06 AM

I think Pawel made some very valid points here. The phase of the project has a key role to play: developers will need to spend a lot of time in meetings during the discovery and design phases of the application.

Apart from this, the software development approach itself has an impact. For example, implementing processes like CMMI-DEV means that administrative tasks also take up a lot of time during the development phase.

Maksym Shostak said:

September 14, 2007 1:31 AM

Who is asking these kinds of questions, and what do they want to measure?

Why aren't the 2nd and 3rd categories of activities also considered "core job activities"?

What about The Mythical Man-Month?


Steve McConnell

Steve McConnell is CEO and Chief Software Engineer at Construx Software where he consults to a broad range of industries, teaches seminars, and oversees Construx’s software development practices. In 1998, readers of Software Development magazine named Steve one of the three most influential people in the software industry along with Bill Gates and Linus Torvalds.

Steve is the author of Software Estimation: Demystifying the Black Art (2006), Code Complete (1993, 2004), Rapid Development (1996), Software Project Survival Guide (1998), and Professional Software Development (2004). His books twice won Software Development magazine's Jolt Excellence award for outstanding software development book of the year.

Steve has served as Editor in Chief of IEEE Software magazine, on the Panel of Experts of the SWEBOK project, and as Chair of the IEEE Computer Society’s Professional Practices Committee.

Steve received a Bachelor’s degree from Whitman College, graduating Magna Cum Laude, Phi Beta Kappa, and earned a Master’s degree in software engineering from Seattle University.