What they are
Have you ever implemented a new process that did not go as smoothly as you would have liked? Have you ever had a new process thrust upon you that had unforeseen problems? If the answer to either of these is “yes,” there is a high likelihood that the new process was not thoroughly tested or not tested at all. Project pilots are tests or experiments conducted to determine if a new process works as designed and to identify where adjustments can be made prior to full implementation. Some key points:
- While pilots will not guarantee that you avoid every issue, they are a nearly foolproof way of catching major problems before a new process is implemented.
- When pilots are not conducted and a process does not work well, it creates distrust among those who will be instrumental in ensuring the process is implemented and used. That is often a recipe for failure.
- Additionally, if a new process is not tested and does not work well, it may lead to the conclusion that “this was a stupid idea from the beginning.” Maybe the entire process is awful, but that is not usually the case. Tossing everything out, or “throwing the baby out with the bath water,” is a bad practice and a waste of valuable time.
How to do them
Let’s assume you are trying to implement an electronic, school/college-wide process for tracking staff presentations at conferences.
- Design and map. If your process relies on IT and technology, bring IT into the conversation. With the example, IT would need to be involved; they may need to build this tracking system or implement one that has been purchased.
- Scope the pilot. One approach is to work with individuals, offices, or units who are already involved with your process change. They are invested and want to see success. With the example, if you have been working with a few individuals who represent select departments within the school/college, pilot with those departments. If the change/new process is only going to impact a few people or offices, you may want to pilot with all of those people.
- Set a time frame. This is dependent on the complexity of the change. A week may suffice, but more time may be needed.
- Assess. You must have a way to assess “what works” and “what needs to be adjusted.” These are known as process metrics: does the process work as designed? For the example: Are all presentations ending up in the tracking system? One way to approach this would be to survey staff to identify all presentations, then measure that against what was collected during the pilot. Yes, this is collecting the same information through two different methods, but this is just a test; it will not be the practice moving forward. You just want to pinpoint the good, the bad, and the ugly.
- Analyze. For the example, you found that 75% of presentations did end up in the tracking system. So, 75% of the time, the process is working, but what happened with the other 25%? To understand that 25%, you might ask those individuals directly. You might get responses like these:
- “I was on leave during the pilot time frame, but would have entered my information.”
- “I couldn’t get logged into the system to track my presentations.”
- “I wasn’t actually sure what counted and what did not, so I didn’t enter anything.”
- “This entire tracking thing is stupid – I don’t care.”
- Adjust. Once you identify what worked and what did not, make the necessary adjustments so the process runs as smoothly as possible. For the example above, each of those responses would require a different solution. The first is really a non-issue; the second is technological, and possibly educational, in nature; the third calls for communicating clearly about inclusion/exclusion criteria; and the last would require communicating why this tracking matters.
- Communicate change. When you are ready to fully implement the process change, make sure you communicate (hyperlink to Communication Plans) it to all of your stakeholders.
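The Assess and Analyze steps above amount to a simple comparison: take the full list of presentations gathered by survey, check each against what the tracking system captured during the pilot, and report the capture rate plus the misses to follow up on. A minimal sketch of that comparison is below; the function name and the sample presentation identifiers are purely illustrative, not part of any real tracking system.

```python
# Hypothetical sketch of the pilot's "Assess" step: compare presentations
# reported in a follow-up survey against what the tracking system captured.
# All names and data below are illustrative.

def capture_rate(surveyed, tracked):
    """Return (fraction of surveyed presentations that were tracked,
    set of presentations the system missed)."""
    surveyed_set = set(surveyed)
    tracked_set = set(tracked)
    missed = surveyed_set - tracked_set
    rate = (len(surveyed_set & tracked_set) / len(surveyed_set)
            if surveyed_set else 1.0)
    return rate, missed

# Example: the survey turned up 4 presentations; the system captured 3.
surveyed = ["Smith-NACADA", "Lee-AERA", "Patel-EDUCAUSE", "Jones-ASHE"]
tracked = ["Smith-NACADA", "Lee-AERA", "Patel-EDUCAUSE"]

rate, missed = capture_rate(surveyed, tracked)
print(f"Process metric: {rate:.0%} captured; follow up on {sorted(missed)}")
# → Process metric: 75% captured; follow up on ['Jones-ASHE']
```

The set of missed entries is what drives the Analyze step: those are exactly the people to ask “what happened?” before adjusting the process.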