
Carl Dickson


Everything posted by Carl Dickson

  1. To write a proposal from the customer's perspective requires not only responding to the RFP, but also understanding how the customer will evaluate your response. How will they read it? Will they read it, or will they simply score it per their evaluation criteria? And if they do score it, what is their process? If the customer has a formal RFP evaluation process, as they do with government proposals, the RFP evaluation criteria can give you clues about their process. Are the evaluation criteria objective or subjective? Do the evaluation criteria specify point scoring? Or are they assessing strengths and weaknesses without a numerical score? (A simple point-scoring sketch appears at the end of this article.) These questions are where you start before you ask yourself the really important questions about how the customer will evaluate the RFP responses: What do they need to see to give a maximum score? What do they need to justify the score they give you? What will they put on their evaluation forms? Where will they find what they need in your proposal?

[Figure: A sample of commonly used RFP evaluation criteria]

Too often people fill their proposals with the things about themselves that they think sound good, hoping the customer will take action on them. Only that's not what the customer needs to complete their RFP evaluation, or even what they really care about. Sometimes companies do a little better and fill their proposals with beneficial-sounding claims. But not all benefits translate into things that can be scored. If the customer likes your offer and wants to make an award to you, what words do they need to justify their decision? That is what you need to put in your proposal. And the RFP will give you clues as to what those words are.

You must carefully parse the language of the evaluation criteria. How are the criteria grouped? Will it be treated as one bid item, or a bunch of specific items? How will they be labelled? What will they be giving credit for? Then ask yourself how you can organize and word your proposal to get the maximum score. Everything that you know about the customer, the opportunity, the competitive environment, and your own company and offering must be aligned with their scoring methods for it to impact whether you win or lose. And outside of things necessary to achieve RFP compliance, if it doesn't impact your score, it's probably not worth saying. If you want to say something, say it in a way that will make it onto their scoring justification forms. If you put saying things your way ahead of saying things the way the customer needs them said to process their evaluation, you will likely end up with a brilliantly eloquent proposal that loses.

Along the way, you will make many discoveries, which could include things like: Will the customer be able to compare apples to apples? Will they be more focused on strict RFP compliance, or have they given themselves room to use their judgment? What does the customer think is important? Price always matters, but how much in this case? Are they concerned about risk? You may also be able to detect whether the RFP is wired for someone else, like the incumbent. But don't fool yourself. All RFPs can look wired, and they almost never really are.

Don't treat this as just an exercise in presentation. There are strategic implications as well. What do you need to emphasize in order to win? What does your offering need to do or be? What strategies do you need to employ in your proposal to outscore your competitors?
For formal evaluations, great proposal writing is not about finding the magic words that will hypnotize the evaluator. Instead, great proposal writing is about translating what you are offering into what the customer needs for their evaluation process in a way that maximizes your score. Everything about the pre-RFP phase of pursuit and everything that goes into your proposal needs to be done with achieving this goal in mind.
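To make the scoring mechanics concrete, here is a minimal sketch in Python of how a point-scored evaluation might roll up. The criteria, weights, and scores are invented for illustration; a real RFP defines its own factors and scoring rules.

    # Hypothetical point-scored evaluation. The criteria, weights, and raw
    # scores are invented for illustration, not taken from any RFP.

    criteria_weights = {
        "technical_approach": 40,
        "management_plan": 20,
        "past_performance": 20,
        "price": 20,
    }

    # One evaluator's raw scores for a single proposal, on a 0-100 scale.
    raw_scores = {
        "technical_approach": 85,
        "management_plan": 90,
        "past_performance": 70,
        "price": 80,
    }

    def weighted_total(weights, scores):
        """Roll per-criterion scores up into a single weighted total."""
        return sum(weights[c] * scores[c] / 100 for c in weights)

    print(f"Weighted total: {weighted_total(criteria_weights, raw_scores):.1f} / 100")
    # Weighted total: 82.0 / 100

Even a toy model like this makes the point: the criteria carrying the most points deserve the most emphasis, and material that doesn't map to a scored criterion adds little to the total.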
  2. The best way to write a great proposal is to get inside the mind of the evaluator and make it easy for them to reach the desired conclusions. It helps to be able to read the proposal like an evaluator. This can be challenging when you don’t know who the evaluators are. But you can still anticipate what an evaluator has to go through and how they’ll approach looking at your proposal. You might also consider the culture of the customer’s organization and the nature of what they are procuring. The answers to the questions below can vary bid by bid based on these factors. The following questions are intended as a way to consider the customer every time you are writing a proposal, and not as a one-time exercise. Is the evaluator the end user of what is being proposed? How does that impact what they’ll look for in the proposals? What guidance has been given to the evaluator regarding their assessment? Are they free to consider whatever they want or reach their own conclusions? What training have they had? What procurement policies or procedures might apply? How does that impact how they’ll read the proposals? If the evaluator is not technical, when they read a proposal do they want an explanation of the technology or an explanation of why it is the best way to get the results they are looking for? What matters about the technology to the evaluator? What should matter about the technology to them? Is the evaluator primarily concerned with the qualifications of the vendors, or how their qualifications impact their ability to deliver as promised? Will qualifications be evaluated on a pass/fail basis? How will they assess who has more or better qualifications? Will they even try? How does this impact the way you should write about your qualifications? How does the evaluator assess experience? Do they quantify it and look for the vendor who has the most? Or do they look for vendors who explain how they apply their experience to achieving better results? How do they consider relevance? Does project size or complexity matter? Do they check your references? How does this impact the way you should write about your experience? Does the evaluator want you to describe yourself, or do they want to know how those details will provide them with better results? How does this impact how you should present details about yourself? Put how you want them to think about you aside for a moment. Do vendors really matter to them? If so, what will they think matters about the vendors submitting proposals? What do the evaluators need to know about you to reach a decision in your favor? If the evaluator is not the decision maker, what do they have to do to justify their evaluation? What do they need to find in the proposals to accomplish this? What tasks does the evaluator have to perform in order to complete their evaluation? Will they have forms to fill out? Rationales to write? Comparisons to make? Checklists to complete? Scores to assign and calculate? Will the way your proposal is presented make this easier or harder? What is motivating the evaluator to take action and what does that imply? What do they care about? What gives them inspiration? How will what they see in your proposal impact their motivation and the actions they take? What must the evaluator believe to accept your recommendations? Does the proposal have to change their existing beliefs or reinforce them? What does the evaluator fear? How will this impact their assessment? 
What can you present to avoid having their fears negatively impact how they assess your proposal? Can their fears help you win? What are the evaluator’s aspirations and goals? How does what you are offering fulfill their goals? Can you show them how making a decision in your favor will help them achieve their goals? What if the evaluator can't find something they are looking for? Have you anticipated what they'll need to locate in your proposal to perform their evaluation? Is everything easy to find? Is the evaluator in a hurry? Does the level of detail and the way your proposal is presented support the evaluator’s pace? Will they appreciate how easy evaluating your proposal is, or will it be a time-sink and drudgery that slows them down? How many proposals does the evaluator have to consider? How much attention will they give each one? Will they be motivated to disqualify as many proposals as possible to lighten their load? Does your proposal focus their limited attention on the right things? Must the evaluator reach a decision at all? What alternatives do they have? What should you say to prevent them from selecting one of those alternatives? Does the evaluator trust you? Does what they see reinforce trust or detract from it? Do they see claims or do they see proof? When the evaluator considers price, do they also consider a company’s ability to deliver at that price? Where do they draw the line between wanting the lowest price possible and the risk of nonperformance? How does that impact your pricing and what you say about it in your proposal? When the evaluator considers price, do they also consider value? Are they willing to pay more to get something more? What do they need to justify a decision to pay more to get a better value? Must it be quantified? The evaluator is concerned about risk, because they are always concerned about risk. How do they assess risk? Are they risk tolerant when it comes to innovation? Or are they risk averse? Which risks are they the most (or least) concerned about? What do they consider sufficient risk mitigation to be? How does this translate into how they perceive strengths and weaknesses, and how they justify their decisions? How does the evaluator define strengths and weaknesses? Is it a strength to meet an RFP requirement, or does it have to be something more? Will risk mitigation count as a strength? What about quality improvements? If you have qualifications that exceed the requirements, will they consider that a strength? If the evaluation is conducted based on an assessment of strengths and weaknesses this becomes critical customer awareness to have. Does the evaluator see your proposal as more work or as an opportunity? Is it the means to get something they want or is it just part of their job? Are they thoughtfully considering or just processing your proposal? What can you do in your proposal presentation to change that? If you were the one to evaluate this procurement and you were doing it formally according to the RFP, how would you set up your evaluation forms? How would the forms relate to the instructions and evaluation criteria? Is your proposal optimized to evaluate well when those forms are completed? Will the evaluator start the proposal review with the RFP and look for how the requirements are addressed in the proposal? Or will they start from the proposal and then try to match it up to the RFP? This can change how you present your proposal. 
If you were the evaluator, and you gave the vendors instructions in the RFP and vendors didn’t follow them, how might you react? How would you assess whether vendors followed the instructions? Would you still be able to perform your evaluation? How hard would you try to evaluate the proposals that didn't follow the instructions? If you are evaluating proposals based on the RFP, but the headings and terminology of the proposals are all different, would that confuse you? How hard do you try to find things that don't match, even if they mean the same thing? If you are keyword searching based on the RFP can you find what you are looking for? If you are the evaluator, how hard will it be to compare the proposals you’ve received from different vendors? Do you compare them to each other, or to the RFP? How should all this impact the way you write and present your proposal? How does it impact the outline, headings, layout, text, and graphics? How does it relate to your strategies for winning? How does it impact the way you articulate things in writing? If your proposal is full of what you want to say, then no matter what great things you have to say about yourself, you may be at a competitive disadvantage to a proposal prepared in a way that matches how the evaluators think. When you sit down to write a great proposal, you must take the information you have to share and present it from the customer’s perspective. Answering the questions above can help you discover your customer’s perspective and present things in the way they need to see them in order to accept your proposal. If you were the evaluator and picking an important vendor, wouldn't you look for someone who thinks the same way you do?
  3. [Video: How to read a Government RFP]
  4. Some organizations seek change and some resist it. Both need to plan and manage change whenever possible. Sometimes change management is addressed explicitly, and sometimes change management techniques are incorporated as part of a project rollout. Contractors who carefully plan and manage changes can offer greater value, develop more trust, demonstrate insight, and prove that they are a considerate partner.

Change management considerations

What is the nature of the change? Why change? Why change now?
Is the change a familiar change or an extreme change?
Is the impact of the change expected to be minor or major?
Can the changes be rolled out in a steady, predictable manner?
How much repetition will be necessary for employees to get acclimated to the change?
How can you ensure there is a clear understanding of the reasons the change is being implemented?
What are the key aspects that you will focus on to ease the transition?
How can you involve stakeholders in creating the change so as to help them respond more favorably?
How do the changes relate to the organization's ability to fulfill its mission and goals?
Who will be impacted the most by the changes?
How can you parcel out complex ideas in such a way that stakeholders can understand and absorb them?
How can you diversify the way you package new ideas to encourage stakeholders to pay close attention?
How can a sense of stability and a vision be instilled as the change is enacted?
How can you incorporate cultural diagnostics to identify conflicts and define factors that can recognize sources of resistance?
What are the training needs that are driven by the change?
When and how do you plan to implement training procedures during the rollout of the changes?
How can you create accountability for stakeholders?
What short-term targets can be set to aid in the transition?
How can you prioritize change initiatives to avoid change fatigue?
Are the resources allocated for implementing the changes sufficient for completion?
What risks do you anticipate related to the changes, and how do you intend to mitigate them?
Will there be any overlap between the current state and the changed state?
Is it possible to roll back the changes once implemented?
How will you monitor status and progress during the rollout of changes?
How will status and progress be communicated to stakeholders?
How much notice or preparation time will be given to stakeholders prior to changes?
How will resource requirements change before, during, and after the changes?
How long will it take to realize the anticipated benefits of the change?
  5. If you don't sell a commodity, starting unprepared to win at RFP release is an organizational failure. You've missed your chance to develop an information advantage and are about to spend time and money on a proposal that you (by definition) aren't prepared to win. It is an organizational failure because the organization has failed at resourcing, planning, prioritization, and ROI calculation. Now that I've said that, let's talk about starting at RFP release and winning anyway. Because it happens. If you want to achieve a high win rate, you won't let it happen. Moving from a 20% win rate to a 30% win rate increases your revenue by 50% with the same number of leads. You won't increase your win rate by submitting more unprepared bids. But for many reasons, sometimes you find yourself bidding even though you're unprepared. Sometimes it even makes sense to do so. Maybe. Just maybe.

Winning at RFP release can be like playing a game. How do you get the top score? Can you do a better job of executing a high-scoring proposal than other companies that start with an information advantage and are better prepared? The answer is probably not. But maybe. Just maybe...

Achieving the maximum score without any real customer insight involves optimizing the proposal around the evaluation criteria and process. It's about organizing your proposal and using the right words to score well. It's easy to fall into the "say anything to win" trap, because words are all you've got to work with when you lack insight. It helps to understand the specific customer's evaluation process. If the customer is scoring you based on "strengths," you really need to know what they consider a "strength." Is it a feature or qualification that fulfills an RFP requirement? Or is it something that goes beyond the requirements? Is it just something that appeals to them? Does it have to be a differentiator? Different customers define the term differently. But if you're starting at RFP release, you probably don't know. You have to guess.

You may also have to resort to word games when designing your offering. Designing your offering by writing about it is bad engineering and can be a terrible mistake. When designing your offering, you are going to have to consider trade-offs. If you don't know what the customer's preferences are, you're just going to have to guess. Or you may have to weasel-word it. You might have to make the wording a little fuzzy to avoid committing, so you can offer the customer both sides of a trade-off while appearing to be flexible and willing to partner with them. It's bad ethics and can set your company up to ruin its past performance, but if the customer likes your positioning more than they dislike your lack of insight and detail, it might work. Not consistently or reliably. Not as a standard practice. But maybe. Maybe just this once...

Taking a chance at winning vs. taking a chance at submitting

In proposals, it is better to take risks and be decisive than it is to try to be everything to everybody. Your win rate will come down to how good your guesses are. A proposal that is everything to everybody will score below a proposal that is exceptional and insightful every time. It is better to risk losing by being exceptional and wrong than it is to be merely satisfactory and acceptable. Satisfactory proposals only win when they are the only one submitted or their price is lower. As a standard practice, that is a recipe for declining profit margins. It is better to risk losing by being exceptional.
The odds of winning may even be higher because you offer something special. If you guess well at what they want, you can appear insightful. And if you guess wrong, well, the odds were against you winning anyway. And while it's rare, sometimes a better prepared competitor will produce a substandard proposal, even though they had the insight to create an extraordinary one. They have to get their insight on paper. If they don't, you have a chance to appear more insightful, whether you really are or not. If you find yourself starting at RFP release, seek differentiators that correspond to the evaluation criteria and an offering design that differentiates by taking risks to be categorically better than simply meeting the RFP requirements. You can't beat a better prepared competitor by trying to be a little better than the unprepared company you really are. Take risks to offer substance instead of weasel-wording. Take a stand and offer something great without worrying whether they'll like it or not. Guessing at how to be great has better odds of winning than reliably being ordinary.

Be the stranger that people want to meet

If you are starting at RFP release, the customer probably doesn't know or trust you. If you are meek, compliant, and ordinary, you can only win on price. But if your proposal inspires the customer with aspirations that turn out to match theirs, while also being compliant, qualified, competent, and substantiated, they might just pick you over someone else they already know who submitted an ordinary proposal.

Make sure your exception doesn't become a routine

A word-game playing weasel guessing at who they should become is not the company you want to be. But it could be what you have to do to win today. Who will you be tomorrow?
  6. Normally I think that even looking at a past proposal is asking for trouble. You don't need that kind of pain. You made mistakes you don't even know about. A lot of them. In fact, based on what we see when we review proposals for companies, there are a lot of problems in them. Even the proposals that won. Why open those wounds? Two words: win rate. A small increase in win rate is worth a large investment. Do the math (there's a quick illustration at the end of this article). But what should you do to improve your win rate? Looking at your proposals to see what's actually getting to the customer and assessing what that says about the organization that created it is a good place to start. It's objective. The customer does not see your intent. Only what you submit.

So here are 11 examples of the kinds of things we learn when we look at previous proposals for companies:

Do your writers focus on the RFP or their own ideas?
Do they show insight that goes beyond the RFP?
Do your proposals effectively present your differentiators?
Are they writing around the fact that they didn't have the input they needed?
Do they have bad writing habits?
Does the organization show they were writing according to a plan, or were they making it up as they went along?
Is writing from the customer's perspective institutionalized?
Is writing to optimize the evaluation score institutionalized?
Were reviews effective?
Did the team run out of time?
Is visual communication an afterthought?

And for each item above, here is where you can invest to improve your win rate:

Focus on RFP compliance first, but don't stop there. Insight beyond the RFP is an important quality indicator, since it indicates whether your proposals demonstrate an information advantage. If you want your proposals to show insight, you have to supply that insight to your writers.

In order to be the best, you must be different in all the right ways. Differentiators matter. They must be a priority. Yet they are often an afterthought. Instead, identifying and articulating them should be the focus of the entire pursuit.

You can't write from the customer's perspective if you don't know what that is. Fluff and unsubstantiated claims do more harm than good. You must gather the right data, assess it effectively, and deliver it to the proposal writers at the right time and in the right format for it to impact the proposal.

Does your organization tend to write in passive voice? Does it tend to make unsubstantiated claims? Does it avoid the issues? Does it speak with the right level of formality or informality? Should any bad habits be fixed by training proposal participants or through back-end editing? Are your reviewers trained to catch bad habits when they reappear?

Implement an effective proposal content planning methodology. It helps to integrate it with developing the quality criteria for validating the quality of your proposals.

If your proposals describe your company and your offering, you've got a problem. Your proposals are probably written from your perspective instead of the customer's. This is a major training issue. It even has the potential to impact your corporate culture, because once people start considering the customer's perspective during proposal writing, it tends to spread. And that's a good thing.

If the customer follows a formal evaluation process, then your proposal is more likely to be scored than read. That means writing your proposal to maximize its score can be more important than optimizing it to read well.
This requires training proposal contributors to organize and write your proposals according to a different set of priorities. It also requires understanding your customer's evaluation process in detail.

If problems are slipping past your reviewers, you've got process and/or training problems. Ineffective reviews can also introduce problems, and sometimes that's the explanation for problems you see in the proposal. An ineffective review process can be worse than not having a review at all. If you don't have a written definition of proposal quality and criteria for use in performing the reviews, there's a good chance your review process needs improvement. Changing the review process often means retraining the highly experienced senior staff who often participate in the reviews.

If your proposal team ran out of time on one bid, it can be understandable. If they run out of time on a significant portion of your bids, your win rate is likely suffering and it's probably worth improving your process and/or resource allocation.

Is the message built around the graphics, or were the graphics tacked on at the end? Graphics can be tremendously effective, but only if they are effectively built into the proposal.

Proposal Content Planning and Proposal Quality Validation are two of the core methodologies of the MustWin Process documented within PropLIBRARY. If you are interested in having an experienced person outside your company review your proposals, here is how to contact us.
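As a quick illustration of the win rate math mentioned above, here is a minimal sketch with invented figures; the lead count and contract value are assumptions chosen only to make the arithmetic visible.

    # Why a small win-rate increase is worth a large investment.
    # The lead count and contract value below are illustrative assumptions.

    leads_per_year = 20                  # qualified bids submitted per year
    average_contract_value = 1_000_000   # revenue per win

    for win_rate in (0.20, 0.30):
        wins = leads_per_year * win_rate
        revenue = wins * average_contract_value
        print(f"Win rate {win_rate:.0%}: {wins:.0f} wins, ${revenue:,.0f} in new revenue")

    # Win rate 20%: 4 wins, $4,000,000 in new revenue
    # Win rate 30%: 6 wins, $6,000,000 in new revenue

Same number of leads, 50% more revenue, which is why reviewing what actually went out the door is worth the discomfort.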
  7. We're turning PropLIBRARY into a platform for corporate-wide continuous win rate improvement. We're starting by adding three days' worth of online training to our single-user subscriptions and creating a new Advanced Subscription that will have a week's worth of online training. Guess what we're doing for our Corporate Subscriptions... Here's a list of topics that the training includes.

One of our goals is to provide a cost-effective way to train everyone in your organization. But we're doing it in a way that is customizable. Do you have business-line considerations that should be part of a bid/no bid determination? Special charge code procedures? Specific people to notify when RFPs are received? They can all be incorporated into the training so that your staff learn everything they need to know the way they need to know it. But because it's online training, when things change you can easily update the training. If you conduct a lessons learned review and discover that something slipped through a review, you can update your reviewer training to improve things in the future. This transforms PropLIBRARY into a platform for organizational change: true continuous improvement of your win rate. And because we have an efficient content management system as a baseline to work from, we can create customized online training and roll it out to everyone who needs it at a cost that's an order of magnitude less than it has traditionally cost.

Let's talk prices and return on investment

A PropLIBRARY Corporate Subscription currently costs $3,000/yr. That price is going up to $5,000/yr when we release our Advanced Subscription training. If you purchase a Corporate Subscription before then, you save $2,000/yr. Customization costs can vary, but we think customization for 80% of companies will cost below $15,000. The net is that a week's worth of customized business development and proposal training for 100 people can cost only $180 a person (the arithmetic is sketched at the end of this article). What will the impact on your win rate be of that level of training, continuously updated and provided to all your staff? We're creating the platform you need to revolutionize the way your organization wins business. We're creating something that will give you a precious competitive advantage against other bidders.

Ordering is easy and electronic

You can place a Corporate Subscription in your cart and check out using PayPal and a company credit card. But you don't have to do it that way. We have other options. And you probably want to discuss it first and see a demonstration. We want to show it to you. We're excited about how it's turning out. Click here to grab a timeslot and we'll set up an online meeting. Just do it before we finish releasing everything and raise the prices.
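The $180-per-person figure above is just the first-year arithmetic using the prices quoted in the article; a minimal sketch:

    # First-year cost per person for a customized Corporate Subscription,
    # using the prices quoted above. The headcount of 100 and the $15,000
    # customization figure are the article's illustrative estimates.

    corporate_subscription = 3_000    # current annual price
    customization_estimate = 15_000   # estimated upper bound for most companies
    staff_trained = 100

    cost_per_person = (corporate_subscription + customization_estimate) / staff_trained
    print(f"${cost_per_person:.0f} per person in year one")   # $180 per person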
  8. [Video: Introduction to the MustWin Process v3]
  9. Getting everyone on the same page

When people working on proposals are pulled in multiple directions and all have different goals and different approaches to achieving them, you're not going to maximize your win rate. Getting everyone on the same page involves more than defining roles and responsibilities, steps in the process, and having good assignment management. It requires having expectation management throughout the process. It requires that all these things be integrated into the process so that they are inherent in the way things work.

Premium content for PropLIBRARY Subscribers:
Defining roles functionally
Roles and responsibilities

Developing an information advantage

You must be extremely good to compete on price and maintain a decent profit margin. You can't maximize your win rate by bidding with only the same RFP everyone else has to guide you. To maximize your win rate, you must bid every pursuit with an information advantage, and you must institutionalize how you develop an information advantage if you want to be able to do it consistently. But it takes more than simply fishing for customer, opportunity, and competitive insight. You must seek the information that the proposal writers will need to write the proposal from the customer's perspective. The reason you need to bring structure to the pre-RFP pursuit is to enable the people gathering intel to anticipate the needs of the proposal writers and bring the insight required to write from the customer's perspective to the proposal.

Premium content for PropLIBRARY Subscribers:
How information flows through the process to become what you need to win
Introduction to Readiness Reviews

Discovering and building your proposal around what it will take to win

While it's hard to believe, most bids are submitted by companies that never explicitly asked themselves what it will take to win. If you bring it up, they will never admit not knowing. But you'll catch them making up an answer on the spot. You can't build a proposal around what it will take to win unless you can articulate it. While there are similarities, every bid is different because every customer has different preferences and different ways of making decisions. You can't articulate what it will take to win if you haven't discovered it. But if you have, you're not done. You must turn your insight into what it will take to win into criteria that can be used to assess proposal quality. You must use what it will take to win to guide proposal development and measure its quality during reviews. When you do this, you shift your efforts from trying really hard to pull out a win at the tail end of the process to making sure a win is within reach before you even start. The result is that you can confidently ensure that everything you do during proposal development is based on what it will take to win. Anything else is just gambling.

Premium content for PropLIBRARY Subscribers:
What it will take to win
Assessing what it will take to win
Creating your "what it will take to win" list
90 things that someone needs to do to win and who is usually responsible

Defining proposal quality and enabling writers to achieve it

While it's hard to believe, most companies do not have a definition, written or otherwise, of proposal quality. Without it, they default to "I know it when I see it," which instead of achieving quality proposals, achieves the opposite.
This is not only because it is an arbitrary measure, and quality should not be arbitrary; it's also because it's strictly a back-end measure. You don't find out whether you have a quality proposal until after it's produced. You can't validate proposal quality or design quality in from the beginning in this environment. The only way to design quality in is to define proposal quality and ensure the writers are working with the same criteria that the reviewers will use at the back end. When you implement a review process that does this, you'll find that the review to validate the quality criteria becomes more important than the review of the draft. Not only will this one switch radically change how you do proposals, it will change your pre-proposal process and corporate culture as well. It will change them for the better and take your win rate to a much higher level. It's the difference between winning by chance and winning consistently and deliberately.

Premium content for PropLIBRARY Subscribers:
Defining proposal quality
Introduction to Proposal Quality Validation
60 proposal quality considerations for 6 key topics

The MustWin Process enables you to achieve all four of these goals

The MustWin Process was designed to achieve each of these goals, while guiding you through all of the activities required to win your pursuits. It makes it easier to get everyone on the same page with the same expectations, while creating an unbroken flow of information that discovers what it will take to win and builds the proposal around it. The MustWin Process defines proposal quality so that proposal writers can design it into the proposal and reviewers can validate it. The result is that the MustWin Process maximizes your win rate, manages expectations, and improves efficiency while reducing risk.
  10. So here's our plan for world conquest. First we developed software that lets us repurpose our huge content library as online courses, complete with videos, quizzes, exercises, transcripts, and more. Next we're going to develop about a week's worth of training. The basic stuff needed to implement our recommendations will be included with a regular PropLIBRARY Subscription. Additional courses that explain how to customize our approaches, adapt them, or use them for organizational development will be part of an Advanced PropLIBRARY Subscription. Now we just have to produce dozens of hours of training. Luckily 80-90% of the material is already done. But that last 10-20% is time consuming, since it involves tasks like video production.

A peek inside our production lab

Instead of doing this development in isolation until we're ready to release it all in one fell swoop, we're going to give you a peek at our production lab. The list below contains the titles we have planned and their status. It's all subject to change, because when we're developing a course, if we think of a better way to present the material, we're going to take the time to rework our content instead of working to the calendar. Titles may be changed, merged, or dropped. We also might take things out of sequence if it speeds up production. So you can think of these as targets, but not promises. Most of the titles below are an hour in length. A few will be more like one and a half hours. A couple will intentionally be a half-hour or less. The Advanced titles may get skipped until the end.

Product Plans

But taken as a whole, there's a ton of training coming. It's a lot of value that we've chosen to add to our subscriptions instead of making it a separate product. Regular PropLIBRARY Subscriptions will be $495. Advanced PropLIBRARY Subscriptions will be $695. If you are a PropLIBRARY Subscriber the day we launch the Advanced Subscription, you'll get automatically upgraded from Regular to Advanced, saving $200. As each course is completed, we'll go ahead and release it and our current subscribers will get immediate access. In some cases, we might release drafts instead of waiting until the final piece is produced.

Let's chat about it

If you'd like to chat about what we're up to, create training to be hosted on our platform, or discuss how this is all just the first step to turning PropLIBRARY into an amazing corporate win rate improvement tool, you can get on our calendar here.

January
Introduction to the MustWin Process. A quick orientation. We're working on this one right now.
How the MustWin Process works. A more detailed roadmap. This one is next.
Proposal startup and logistics. While first in sequence, this topic may get produced last in this group.
Kickoff meeting options (Advanced)
Creating a compliance matrix. This is where we'll start, since it's more important and useful to more people and we want a complete set of guidance for planning the content of your proposal to be our first major release.
Advanced compliance matrices (Advanced)
Creating a proposal outline
Introduction to Proposal Content Planning
Implementation options (Advanced)
Guiding writers (Advanced)
What should go into your Content Plans?
Boilerplate and re-use (Advanced)
Bid strategies and proposal themes

February
Functional Roles and Responsibilities
Resource allocation and staffing (Advanced)
Introduction to the proposal writing process
How to write for proposals
What to do when you get a proposal assignment
Proposal writing from the customer's perspective

March
Introduction to Proposal Quality Validation
Defining Proposal Quality
Implementing Proposal Quality Validation (Advanced)
Performing Proposal Quality Validation
Proposal sight reading
Proposal completion
Post submission

April
Pre-RFP pursuit
Pre-RFP process design (Advanced)
Performing Readiness Reviews
Customizing Readiness Reviews (Advanced)
Discovering what it will take to win
Offering design
Bid/No bid decisions
  11. Thanks for another year of double-digit growth. We had over half a million visitors in 2016 and served over a million page views, with a 25% growth in new users. Visitors from social media about doubled. Thank you for all the social media reposts and referrals that made that growth happen. And another thanks for the many fascinating discussions it made possible and the inspiration it provoked.

In case you missed any of them, here are the 10 most popular items we published last year:

Proposal recipes for inspiration
Figuring out what to say in your proposals
8 things great proposal writers do differently
Proposal management and organizational development
How to win in writing
Introduction to The MustWin Process
Attention Executives: How do you know if someone is ready to be a proposal manager?
This bidding strategy can destroy your company before you realize it
Why we don't recommend color team proposal reviews
44 things you might want from your Proposal Manager

In 2017 we will be rolling out new online courses and content faster than ever!
  12. Designing quality into a proposal means building a process in which defects aren't created. This is far superior to a process that allows defects and then tries to find and fix them. Designing quality in doesn't just mean making sure you get it right the first time; it means eliminating the possibility of defects. It is a very different approach from creating a draft and then inspecting it. Most existing proposal processes are based on creating a draft, then inspecting it, then re-doing it as needed. Instead of a proposal designed to win, this model results in fixes applied to fixes that only end when the deadline is reached. This kind of approach encourages people to just write something, see how it works out, and if it doesn't, just keep writing things. The entire color team model is based on doing just this. The proposal process that most companies follow is deeply flawed because it is based on a flawed concept of quality. Creating a proposal process that designs quality in means following a different process and having very different expectations.

First, we have to address what a proposal defect is

If we define proposal quality as reflecting what it will take to win, a defect is something that runs counter to what it will take to win. The magnitude of the defect is directly proportional to how it impacts what it will take to win. Thus, while you might consider a single typo to be a defect, it is, in all likelihood, an inconsequential one. However, failing to comply with the RFP requirements, ignoring the evaluation criteria, or scoping the offering wrong and blowing the price are defects that can ensure a loss. To be a defect, it has to lower your win probability. To keep this from being purely subjective, you must define what it will take to win in order to assess the severity of a potential defect. For proposals, designing quality in means building a process that ensures that proposals reflect what it will take to win. You should note that this does not ensure a win. That is not up to us, it is up to the customer, and it occurs in an unpredictable future. What you can do as a company is define what you think it will take to win, and confidently deliver that.

Designing quality into a proposal starts with how you design your offering

Building a proposal around what it will take to win, and doing that without defects, means you need to know what you want to offer before you start writing. Offering design is an engineering process, and designing your offering should not be done by writing about it. You may select any engineering design methodology that fits the nature of what you offer. The goal of offering design is to ensure that what you intend to offer fulfills what it will take to win before you create a narrative draft of the proposal based on it. You should validate your offering design through whatever testing, reviews, approvals, or stakeholder participation is necessary. This does not mean that you read a description and decide whether it sounds good. It means that you determine that the design is sound, compliant, competitive, and what the company wants to offer. Then you can describe it and determine how best to present it. As an example, an offering design that is too expensive to win, non-compliant, or doesn't reflect the evaluation criteria would fail this validation and should never make it into the proposal. Zero time should be spent writing about it.
Before you can validate your offering design, you need to be able to articulate what it will take to win in the form of proposal quality criteria. To perform the validation, you need to establish that each criterion related to what it will take to win has been fulfilled. The good news is that much of this can be checklist-simple.

Thinking things through instead of writing and re-writing

While the offering is being designed, you should also be designing the proposal, with a separate plan for the proposal content that identifies what you need to say and how you need to say it. This plan should address the structure of the proposal and how you should present the details of your offering. It should explain to proposal writers how to position things against the evaluation criteria, the competitive environment, and what you know about the customer. It should explain what points need to be made and how to incorporate your bid strategies and differentiators. It should explain what proof points you will use to make the conclusions you want the customer to reach credible. The Proposal Content Plan should also be approved before using it to draft the proposal narrative. Designing quality into the proposal means having thought through the offering and its presentation before you create a narrative draft. This is the only way to create a draft that is without defects and fully incorporates what it will take to win.

Indecision works against your ability to design quality in

During proposal creation you will face many decisions that need to be made. To design quality in, decisions must be made so that they can be validated or approved. Designing quality in requires a series of validations and approvals with the key stakeholders and decision makers instead of waiting for milestones to have subjective reviews. Creating a draft for someone to inspect and see what they find means you've fallen back on the old model and are not designing quality in from the beginning. The problem with the old model is that when the review and validation of a decision waits for a draft, changing the decision requires an entire draft production cycle. This is not only time consuming against a deadline, but it tends to result in the next draft being a kludge instead of something created with the final decision in mind. Designing quality in means making sure you have made the right decision, and then writing it into the proposal. Poor corporate decision making is the real reason companies stick with the old model of inspection and repair. They can't decide, so they wait and see how the draft looks. A better approach is to improve your organization's ability to make decisions so that quality can be designed in. If you don't do this, you are unlikely to be competitive against an organization that does.

Don't fall back on old habits

When you design your offering separately from writing about it, design your proposal content before you create narrative, and validate decisions as you go along, you have the right foundation for designing quality into your proposals. If you catch yourself correcting the message in writing, then you failed to think it through before writing started and have fallen back on the old model of writing and correcting. The new model is not based on correction at all. It is based on collaboration. Proposal writing, decisions, and validation are all one collaborative process. Reviewers are part of this collaboration and not occasional drive-by correction police.
You know the new process is working when you still hold reviews because quality control is needed, but your reviewers end up focusing on editing and wording because there are no substantive issues to report. You know it is working because you are not relying on reviews to "fix" your proposals.

Don't fear running out of time

Regarding time management, note that designing quality in takes less time than preparation, inspection, and rework. The types and amount of information that need to be gathered to win a proposal are the same with either model. The time it takes to make decisions is the same with either model. The amount of review time is the same. However, the amount of work product created and wasted effort is less when you design quality in. In addition, there is less risk and likelihood of production costs spiraling out of control. There is less risk of something slipping by, and even less risk that it might hurt your win probability if it does. The most important consideration is the impact on your probability of winning. Which would you rather submit: a proposal that was designed to win, or a proposal that consists of fixes applied to fixes until the clock ran out? Designing quality in through continuous validation enables much better time management than having one or more back-end reviews. Better time management leads to a better proposal, a higher win rate, and being more competitive than companies that follow the old model of write and fix.
  13. The following lists of proposal quality considerations have many uses. They can be used to replace review teams. They can be used to enhance review teams. They can be used as checklists by writers and reviewers. They can be used to define proposal quality. The checklists below are quality assurance checklists and not procedure checklists. Instead of telling you what to do, they ask whether you have achieved what you should have. They are intended to be used to assess whether what is created reflects what it will take to win. Proposal developers who want to win will use them as checklists to build quality in, rather than waiting and only using them to correct things later. For each aspect of the proposal, go line by line and consider each item on that list. If you just consider the items collectively across the section or proposal, you will miss opportunities to strengthen the proposal and increase its competitiveness.

Quality considerations related to the offering design

Was a methodology used to develop and validate the offering design, or was the offering design developed by writing about it?
Is the offering design compliant with all RFP specifications?
Are sufficient competitive advantages built into the offering design?
Is it clear why any trade-offs made during offering design were chosen?
Is the rationale for any teaming arrangements, and how the customer will benefit from them, clear?
Have all assumptions that were made during offering design been accounted for?
Is the offering design price competitive? How do we know?
Does each aspect of the offering design address who, what, where, how, when, and why?
Does the offering design anticipate and provide the information proposal writers will need?

Quality considerations related to bid strategies

Is it clear what differentiates the offering, and is that compelling?
Does the proposal explain how the customer benefits from those differentiators?
Is it clear how we position against the competitive environment, evaluation criteria, customer concerns, technical considerations, management considerations, pricing, etc.?
Does the offering explain what our strengths are, why they matter, and how the customer will benefit from them?
Does it say what the decision makers need to hear to decide in our favor?
Does it take into consideration the customer's evaluation criteria and procedures to maximize our score?

Quality considerations at RFP release

Do the contents of the RFP match what was expected? Do any plans need to change?
Does the RFP contain any requirements that we can't comply with?
Does the RFP contain any requirements that are show-stoppers (including all performance standards, service levels, contract terms, and other specifications) that prevent you from wanting to bid?
Have a schedule, assignment list, review plan, production plan, and content plan been prepared?
What approach, schedule, agenda, and participation are required for the Kickoff Meeting?
Have all required resources been identified and allocated? Are they available and capable of fulfilling their assignments?
Is the schedule the best allocation of time, and are deadlines realistic and enforceable?

Quality considerations related to Proposal Content Planning

Have we created an RFP compliance matrix? (A minimal sketch of one appears after these checklists.)
Has the compliance matrix been verified to ensure it properly accounts for all requirements?
Does the proposal outline incorporate everything that the compliance matrix indicates it should?
Does the wording of all headings track to the wording of the RFP?
Will the proposal be organized in a way that is easy for the customer to evaluate?
Does the content plan reflect the correct RFP requirements, per the compliance matrix?
Does the content plan provide instructions for the writers to optimize the score against the evaluation criteria?
Does the content plan provide instructions for addressing our bid strategies (see the list of bid strategy considerations)?
Does the content plan provide instructions for the writers regarding how to incorporate any customer, opportunity, or competitive intelligence we've gathered?
Does the content plan provide instructions for writing about the offering?
Does the content plan provide instructions for where and how to incorporate project citations, references, or data that can enhance our response?
Does the content plan provide instructions for communicating visually, including the use of graphics, illustrations, tables, relevance boxes, pull quotes, examples, screen shots, etc.?
Does the content plan provide instructions for handling any assumptions, limits, boundaries, or issues that must be resolved?
Does the content plan provide instructions for using any boilerplate or re-use material that might be relevant, noting any deviations or corrections required?
Are the content plan instructions sufficient to guide writers in creating the winning proposal?

Proposal Writer's Checklist

Does the draft explain what matters about our experience, capabilities, qualifications, and offering?
Does every sentence in every paragraph make a point that matters?
Does the draft say anything that we think matters but the customer won't?
Can anything be deleted from the draft without impacting the evaluation?
Is the draft written from the customer's perspective instead of simply describing us and our offering?
Does the draft reflect our full awareness of the customer?
Does the draft show insight beyond what was in the RFP?
Does the draft implement the correct bid strategies (see the list of bid strategy considerations)?
Will the draft answer the questions we anticipate the customer might have?
Will the draft give the customer the information they need to reach a decision?
Does the draft motivate the customer to take the desired action?
Does the draft complement our pricing strategies?
Does the draft explain why our approach to the trade-offs is superior?
Does the draft communicate visually?
Does the draft follow the RFP instructions and maximize the potential score against the evaluation criteria?
Does the draft follow the instructions of the proposal Content Plan?

Production and Submission

Have any modifications/changes to the RFP been accounted for and addressed?
Has anything changed during final changes or editing that might make the proposal non-compliant with the RFP?
Does the proposal comply with all formatting instructions included in the RFP?
Does the proposal include all documents, forms, attachments, and parts required by the RFP?
Is the proposal free of errors?
Is the proposal properly packaged and labelled?
Does the proposal delivery plan mitigate all possible risks?
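One simple way to picture the compliance matrix these checklists keep coming back to is a table that maps every RFP requirement to the proposal section that answers it, which also makes unassigned requirements easy to spot. A minimal sketch; the requirement IDs and section names are invented for illustration:

    # Minimal sketch of an RFP compliance matrix: each requirement is mapped
    # to the proposal section that addresses it. The requirement IDs and
    # section names are invented for illustration.

    compliance_matrix = {
        "L.4.1 Describe your technical approach":     "2.0 Technical Approach",
        "L.4.2 Describe your staffing plan":          "3.1 Staffing Plan",
        "L.4.3 Provide three relevant past projects": "4.0 Past Performance",
        "M.2 Address transition risk":                None,  # not yet assigned
    }

    unassigned = [req for req, section in compliance_matrix.items() if section is None]
    if unassigned:
        print("Requirements with no proposal section assigned:")
        for req in unassigned:
            print(f"  - {req}")

Verifying the matrix before writing starts is what lets checklist questions like "Has the compliance matrix been verified..." and "Does the proposal outline incorporate everything..." be answered with evidence instead of opinion.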
  14. Warning: This article challenges things people may have learned about proposals when they got started. It's good to learn better ways of doing things. But I find it amazing how much resistance there is to change in this area and how much it holds back the improvement of proposal quality. So forgive me if I'm a little too honest...

Among the thousands of subscribers and hundreds of companies I've worked with, I've never seen one use the color team model to achieve consistently effective proposal reviews. Color team reviews are not even close to being the only way to achieve proposal quality. Frequently color team reviews cause more problems than they solve. Occasionally, we've seen companies do what amounts to inventing their own proposal review methodology, giving it color team labels, and then claiming they follow the color team methodology. But we discourage even doing that, since it carries forward some of the flaws in the color team model.

This doesn't mean that you shouldn't have proposal reviews

The primary justification that people give for continuing to follow the color team model is that it's better than nothing. But the alternative to color teams is not doing nothing. The alternative is a methodology that does a better job of validating the specific things that define proposal quality. The color team review model was a great first attempt at quality control for proposals, but we now have much better methodologies for quality assurance. In fact, proposals are one of the last areas where people ignore what we've all (should have) learned from quality methodologies such as ISO, Six Sigma, and others. Today, color team proposal reviews are mostly ineffective, obsolete, and not worth preserving. They linger on only as a bad habit, a product of resistance to changing terminology that lowers win rates. The color team review model neither defines proposal quality nor validates its specific attributes. Color team reviews easily degrade into subjective opinion fests that provide more confirmation bias than changes to win probability. They don't even deliver the intended benefit of unbiased opinion. As a result, color team proposal reviews do not deliver quality assurance. Color team reviews are not a best practice, and it's a sign of process immaturity when companies claim them as their methodology, even if it was taught to you in a class. They stand in the way of best practice. Color team reviews are actually a bad habit that you need to change. If you're still clinging to color team proposal reviews, here are some of the defects that lead to these conclusions.

The scope of color team proposal reviews is not well defined

Color team labels mean so many things to different people that they have become meaningless. Every proposal reviewer brings a different preconception about the scope of the review. When you use color team labels, no one really knows what you're talking about except in the broadest sense. Each person has a different concept of the scope and is certain theirs is the right one. How are you supposed to validate the specific attributes of proposal quality if everyone defines them differently? Most color team reviews are fishing expeditions where reviewers see what they can find. Can they find a defect? Can they find an opportunity for improvement? Reviews like this do not produce results on purpose; they produce them by chance. How often do the opinions of reviewers conflict with each other?
How often do they focus on a minor detail instead of providing the validation the proposal team needs?

Take the popular Red Team review as an example. Most Red Teams try to review: capture strategies, the proposal outline, RFP compliance, accuracy, effectiveness of the approach, persuasiveness of the writing, completeness of the document, production quality, how you stack up against the evaluation criteria, implementation of win themes, and incorporation of customer, solution, and competitive awareness. Is it any wonder that they run out of time and never fulfill their goals? Or that you can’t get participants to focus? Or prepare ahead of time? Or that they always degrade into subjective preferences when reviewers run out of time but need to deliver feedback without admitting they didn't finish?

What does it even mean to "review" a proposal? "Read this and tell me what you think" is not a quality methodology. It's a second opinion. That's not bad to have, but it is not a quality methodology. Having an open-ended, unscoped review of undefined quality attributes is never going to produce proposal quality. Why would you even attempt to set up a proposal review process that way?

Setting yourself up for failure

The problem is not the people performing the reviews. The problem is the defects in the model they are trying to implement. A proposal review model should define the scope of what is to be reviewed, identify the quality criteria you will assess against, and provide reliable methods for validation. The color team proposal review model doesn’t do what we really need for proposal quality assurance.

How often does the color team model end so badly that it requires another review to attempt to make up for the failure? This is not just a failure in completing the writing, it's a failure to define what the writing should be until an opinion is handed down (too late) by the reviewers. It is a failure in the color team model because it provides the definition of proposal quality after the writing. The color team model is what gives everyone working on proposals the terrible impression that we should rush to draft so that we can read the proposal and then figure out what it should have been.

There aren't enough colors

How often do color team review failures result in simply adding another review? And another? And so on, until they end up submitting what they have, thrown together in a rush at the deadline? You can't fix the color team model by adding more colors. There aren’t enough colors, and sit-around-a-table reviews are not the best way to validate every quality criterion.

Some companies have Blue Teams, Pink Teams, Red Teams, Green Teams, Purple Teams, Gold Teams, White Teams, and occasionally other colors. Is a Pink Team review an outline review, win theme review, capture strategy review, storyboard review, progress review, draft review, production plan review, or all of the above? Is a Red Team review an early draft review, a complete draft review, or a review of the best draft we can manage to create in time? Is the Blue Team a progress review, lead qualification review, bid/no bid review, win strategy review, or an assessment of readiness for RFP release? I've seen these reviews conducted every which way. Just never effectively. Companies define the colors according to the convenience of the moment. In practice, color teams are a subjective exercise that does not fulfill the very real need for quality validation.

Color reviews get in the way of proposal quality

Color reviews don't improve the proposal.
The color model gets in the way of defining what the proposal should become and provides no way for the writers to self-assess whether they have fulfilled their assignments correctly. The color team model gets in the way of designing quality into the proposal by failing to get the writers and reviewers on the same page regarding how to define and measure proposal quality before proposal writing even starts. The color team model is the reason companies don't plan before they write and then write to the plan.

You can't fix the defects in the color team model with better techniques

Sometimes color team reviews are advocated by someone with a particular technique they are convinced "works," but that in reality only works when they implement it. A process that depends on a particular person to implement it is not a process. A process that can’t be implemented the same way twice is just a way of doing things. No matter how skilled, a review team lead is not a process, either. You need a proposal quality process and not a culture of personal dependency. A side effect of continuing this madness is that it ensures that nearly all reviewers come into a color team review with the wrong expectations regarding what to look for and what to do about it. How is this supposed to achieve quality?

Getting everyone to have the same expectations requires change

Even if you could fix color teams, you’d still have to retrain everybody before every proposal. If your staff have been taught the color team model, you can't count on their consistent performance, no matter how skilled or well-meaning they are. They may have good eyes, but the process is irremediably flawed. And in the future, everyone who hasn't been retrained will show up trying to do things the flawed way. One of the benefits of not using color team labels is that it forces recognition that expectations have changed. There really is no other path forward.

It’s better to design quality in than to apply corrective patches against a deadline

Checking in from time to time to see if the proposal is broken so that you can fix it is one way of catching defects that should have never been made. It’s good to inspect to catch errors, and you will always need to check your work. But it’s better to prevent errors in the first place. It’s even better to design out the possibility of errors. The color team model provides no way to do this. But that doesn't mean it can't be done. If you care about the quality of your proposals and realize what improving your win rate can do for your company's bottom line, this is very much worth looking into. The color team model does not design quality into the proposal on the front end.

Collaboration vs. correction and collaboration vs. objectivity

One of the supposed benefits of color team reviews is that participants are supposed to be independent. Being separate from the proposal team, they can look at the proposal with fresh eyes. However, looking at the proposal with fresh eyes when it is closer to the deadline than to the RFP release date is a bad way to try to improve quality. That's a huge source of last-minute radical course changes. Those course changes should have come much, much sooner. A quality methodology should lower risk and not raise it.

Objectivity is most relevant to validating decisions made regarding offering design and win strategies. Both should be validated very early in the proposal lifecycle and not late in the game.
The color team model's after-the-fact way of defining quality is terrible for making sure you are proposing the right offering. A different approach is required to validate that what you intend to offer is what will result in a winning proposal. Win strategies are often assessed during Blue Team, Pink Team, and Red Team reviews. More colors are not better. More subjective opinions are not better. A better approach would be to define what it will take to win as quality criteria and to measure your win strategies against those criteria.

The expertise and experience of color team participants can sometimes be more beneficial if used at the beginning, instead of much later. But then those participants are no longer objective when the review comes. Other good things can outweigh the value of objectivity in late-stage reviews. Many organizations are too small to achieve real objectivity anyway. Those organizations are better served by models that are more collaborative and involve validating decisions as they are made, instead of late-cycle corrective meetings.

The issue isn’t really objectivity, it’s quality. What is the best way to achieve quality in your organization? Is it through back-end corrections or front-end validation? Does the color team model deliver what is needed to validate that your proposal is built around what it will take to win, or does it deliver subjectivity and last-minute surprises?

There is no good time to have a Red Team

You can have your Red Team too early, or you can have it too late. If you have it too early, you are asking people to review a document that is incomplete and different from what the customer will see. If you do it too late, the document will be more mature, but you will be out of time to make any changes. If you only have one review, it can be worse than having no reviews at all. This is why people add “pink team” reviews, or have follow-up Redder-Than-Red Team reviews. Adding colors will not solve the problems inherent in the model. Needing to add colors isn’t proof that you did the proposal wrong. It's proof that the color model has defects leading you to late-stage corrections.

Experienced opinions are still just opinions

The color team model tends to degrade to simply gathering the most experienced people available and getting their opinions. However, these people are rarely available. And when they are, they usually can’t dedicate the time that a good review requires. They don’t even come close to fully validating all of the criteria necessary to establish proposal quality. There isn’t enough time. Instead of asking them to review everything about the proposal, their experience should be focused on validating the decisions that produce the resulting proposal.

It is not realistic to expect senior staff to be available to participate in an unlimited-scope review of every proposal a company produces. When you don’t have a definition of proposal quality or criteria to validate, this is the best you can do. But when you define proposal quality and identify quality criteria, you can involve additional staff, validate most criteria without it being a major production, and achieve better proposals as a result.

Color team reviews usually lack leadership and accountability

Color team reviews rarely address who oversees the reviews, holds reviewers accountable, calls them to order, instructs them in their mission, and teaches them how to do their job.
Usually it defaults to the Proposal Manager to direct the Red Team, consisting of staff who are much more senior and difficult to direct. All while trying to produce the proposal. Questionable quality practices aside, does this sound like a good way to manage an important corporate function? Leadership by default is not a best practice. Every step in an effective workflow must have oversight, accountability, guidance, and training. This is the role of a leader. To be effective, every review must have one.

Better than nothing is not good enough

The primary justification for the ineffective color team model is that it is better than nothing. This is actually debatable. A strong case can be made that the color team model does more harm than good by getting in the way of planning before you write, encouraging unlimited draft cycles, creating a culture where proposal quality is subjective, and fixing proposals on the back end instead of designing quality in from the beginning. Instead of looking at it as if the color team model is better than nothing, isn't it better to look at it from the perspective of what people need to achieve a quality proposal, and to measure the results by how it impacts your win rate?

One more defect of the color team model is that it prevents accountability. There is no way to account for what impacts your win rate when you have subjective, unscoped reviews. This leads to making decisions about how to win based on yet more opinions. The fulfillment of quality criteria, by contrast, can be quantified. When you look at the need to validate quality, you should be asking how quality is defined and how people will know it when they see it (other than by opinion). If you do not do this, your competitors will. Sticking with an approach that is merely better than nothing dooms your company to declining win rates. The color team model is not competitive against more professional proposal review processes. An improved win rate is worth the investment in change.

P.S.: If you read the above and think “That’s because they are doing it wrong. If they only did it the way I do mine it would work so much better,” you need to review the part about how a process that’s dependent on an individual person isn’t really a process. If you start by saying "I do things better and it works," then you need to reread the part about how the only way people are successful is when they create their own methodology and give it color team labels. You haven’t fixed the defects inherent in the model, you’ve just invented your own. Congratulations!
  15. Companies have different issues that impact what is the best approach for achieving proposal quality. For example, if you only have a few people capable of performing quality reviews and you use them on every proposal, the best approach will be different from an organization where the participants in the reviews are different for every proposal. Here are some other considerations:

The level of training and expertise of available reviewers. Untrained reviewers, even if experienced, may not be capable of validating proposal quality. Training can be embedded into the methodology. Or the methodology can be simplified to lower the amount of training required.

The availability of reviewers. When reviewers simply aren’t available, the methodology needs to feature more self-evaluation. When reviewers have limited availability, there is a tendency to have fewer but larger reviews, lowering their effectiveness. A scalable set of criteria can help ensure that the critical items are adequately covered.

Budgets and how you account for the reviewers’ time. Budgets can make something with a great return on investment (ROI) look too expensive to implement. An effective quality methodology increases your win rate significantly. Having proposal quality reviews does not add to the cost; it adds to your ROI. If people are minimizing reviews to save money, you might want to check your math and see if your accounting system is getting in the way of your ROI.

The size, scope, complexity, and consistency of your proposals. The effort to achieve proposal quality is based not only on the page count of your proposals, but also on the complexity of the content. What you need to validate in a proposal with a simple structure is different from a proposal with a complex structure. When implementing a quality methodology, consistency is also a factor. Is every proposal a unique creation? Do you always bid a similar offering? How much variation is there in the RFPs? Do you need to validate every decision and unique aspect? Does that change with every proposal?

The anticipated award value. How much is it worth to make sure this proposal is the best it can be?

Risk and return on investment. How much quality risk can you tolerate? How much are you willing to risk your investment in the proposal?

The importance of win rate. How much will an improvement in your win rate increase your bottom line? What role should your proposal quality methodology play in achieving it?

With those considerations in mind, consider these three approaches to proposal quality. Each has different characteristics that make it better suited to some organizations than others.

Proposal Quality Validation. A criteria-based approach that focuses on defining proposal quality, turning that definition into criteria, and then validating that the proposal reflects those criteria. Implementation involves giving the same criteria to the writers that the reviewers will use. The methodology is very efficient. It requires a great deal of thought, but a low level of administrative effort. The basic premise is that how you perform the reviews is far less important than what you review. It requires reviewers to validate against the criteria and not simply comment on whatever enters their mind during the review. It requires an integrated process that spans the pre-RFP and post-RFP phases, with attention paid to defining proposal quality criteria at the beginning of the proposal.
Checklists instead of reviews. In high-volume environments, or when trained reviewers are not available, the things reviewers would look for can be rendered as checklists. Checklists can be based on topics like the proposal outline, RFP compliance, bid strategies (differentiators, themes, etc.), offering design, competitive positioning, writing (accuracy, whether it reflects the customer’s perspective, etc.), and final production (format, assembly, etc.). Converting quality criteria into checklists works best when there is a great deal of consistency between your proposals, although a hybrid approach is possible if you provide a place on your standard checklist for additions covering pursuit-specific items. It may also be more suitable for low-value proposals. Depending on your needs, checklists might require one or more sign-offs to maintain accountability (see the sketch at the end of this section). One interesting feature of this approach is that you can eliminate some or possibly all of your sit-around-a-table, read-it-all reviews. They are not the only way to achieve proposal quality.

Design quality in. When the experienced staff who are the only available reviewers are also the key decision makers for the offering, the idea of an objective review team is not feasible. So embrace it. Instead of a break-fix quality approach that uses objective reviewers on the back end to assess whether the proposal meets standards, try designing quality in at the beginning. In this approach, the “reviewers” validate proposal team decisions as you go along. Instead of waiting for the document, they review and approve the decisions regarding the offering, strategies, and approaches to the writing. They review what the proposal will be as much as they review what it has become. When the proposal is written, the review team no longer needs to consider whether the approaches are correct, but can instead focus on whether they’ve been implemented effectively. It is worth considering the value of objectivity vs. the value of designing quality in from the beginning. While both have value, how well you can achieve them depends on your resource availability.
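To make the "Checklists instead of reviews" approach concrete, here is a minimal sketch of what rendering quality criteria as a sign-off checklist might look like. It is an illustration only: the topics, criteria, and sign-off role shown are hypothetical, and a shared spreadsheet or form serves the same purpose. Python is used here simply because it is compact.

    # A minimal sketch of quality criteria rendered as a checklist with sign-offs.
    # The topics, questions, and roles are hypothetical placeholders.
    from dataclasses import dataclass, field

    @dataclass
    class ChecklistItem:
        topic: str          # e.g., "RFP compliance"
        question: str       # the criterion, phrased so it can be answered yes/no
        signed_off_by: list = field(default_factory=list)  # who has verified it

        @property
        def complete(self) -> bool:
            return len(self.signed_off_by) > 0

    # Hypothetical pursuit-specific checklist built from standard topics
    checklist = [
        ChecklistItem("Outline", "Does the outline follow the RFP instructions?"),
        ChecklistItem("Compliance", "Is every compliance matrix item addressed?"),
        ChecklistItem("Bid strategies", "Are the differentiators and themes implemented?"),
        ChecklistItem("Production", "Are all required forms and attachments included?"),
    ]

    def open_items(items):
        """Return the criteria that still lack a sign-off."""
        return [i for i in items if not i.complete]

    checklist[1].signed_off_by.append("Proposal Manager")
    for item in open_items(checklist):
        print(f"[OPEN] {item.topic}: {item.question}")

The point is not the tooling. It is that each criterion is phrased so it can be verified, and that accountability is recorded as a sign-off rather than as an opinion.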
  16. As with color team reviews, I have never seen two companies conduct Black Hat reviews the same way. At a high level, a Black Hat review is a competitive assessment that addresses who the competition is and their strengths and weaknesses. Sometimes a Black Hat review scores your company and your competition against the anticipated evaluation criteria to determine who has what advantages, and what to do about them. At a minimum, a Black Hat review should help you discover how to best position your company against its competitors, and what differentiators to focus on in your proposal.

There are many techniques for acquiring and assessing competitive intelligence. These range from simple SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis to more sophisticated methodologies. The technique you use is less important than your diligence in implementing it. Garbage in, garbage out. However, as with risk assessment and quality assurance, if you don’t implement a formal methodology, you won't consistently get good results.

Any competitive assessment is only going to be as good as the data it's based on. Gathering good competitive intelligence takes time. If you simply bring all the stakeholders into a meeting and ask them what they know about the competition, you will not get the best data to work with. If you start trying to collect competitive intelligence at RFP release, you will not have the best quality data to work with. You must collect and validate competitive intelligence data throughout the lead qualification and capture phases of the pursuit if you are going to have solid data to assess at a Black Hat review.

Getting results

The key to a successful Black Hat review is to translate what you discover about your competition into action items. And those action items should not simply be to fill in the holes in what you should already know. A Black Hat review gives you a chance to demonstrate how much you know of your competitors' strengths and weaknesses. But if you don't have enough information to reliably assess their strengths and weaknesses, it gives you a chance to do what it takes to find the information so that you can. If you can't even name your likely competitors, you're not ready for a Black Hat review.

The action items coming out of a Black Hat review need to affect your capture and proposal strategies in ways that will impact your probability of winning. Otherwise, a Black Hat review becomes an academic exercise that doesn’t affect your chances (this is a fancy way of saying “a waste of time”). How should you change what you intend to offer as a result of your competitors' strengths or weaknesses? What should you say in your proposal to position your company against their strengths or weaknesses? Should you employ ghosting to suggest conclusions the customer might reach about your competitors' supposed strengths or to point out their unmentioned weaknesses?

Action items related to teaming should also come out of a Black Hat review. A Black Hat review should tell you which companies are strong where you are weak and therefore make good teaming candidates, because you are more likely to win together than if you remain apart. It can also show you who you might want to take off the street by teaming with them. It may be better to give up a portion of the revenue by teaming with someone than it is to take the risk of losing all the revenue in competition. Or not.
A good Black Hat review should help you assess this quantitatively, by showing the effect of different teaming scenarios on evaluation scoring (see the sketch at the end of this section). That’s another reason that formal competitive assessment methodologies can be valuable. They help you look at things objectively by providing the means to rank and score the competition.

Black Hat vs. Price to Win reviews

A Black Hat review is not the same thing as a Price to Win review, but they do have some overlap. A Price to Win review focuses on predicting what the winning price will be. And part of doing this is assessing what prices your competitors will bid. This requires a specialized form of research in order to discover details like a competitor's pricing range and their likely overhead range, based on the data you can find. Competitively assessing price positioning and strategy is potentially relevant to a Black Hat review, and it can inform not only your solutioning but also your proposal writing.

Using them to gain an advantage

Finally, a Black Hat review should help you finalize your win strategies. Win strategies should not be developed in isolation from your competition, because the whole goal of having win strategies is to beat your competitors. It's not enough to simply articulate to the customer why they should select you. You must also be able to say why they should select you rather than your competition. A Black Hat review can help you formally position yourself against the competition instead of just guessing, the way most people (including your competitors) do. If you do your homework before the review so that you have enough information to assess, a Black Hat review can help you drive winning into your proposal.
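To illustrate the kind of quantitative assessment described above, here is a minimal sketch of weighting anticipated evaluation criteria and comparing teaming scenarios against them. Everything in it is assumed for illustration: the criteria, weights, scenario names, and scores are hypothetical placeholders, and a spreadsheet works just as well. What matters is that the comparison is made against the customer's anticipated evaluation criteria rather than by gut feel.

    # A minimal sketch of scoring teaming scenarios against anticipated evaluation
    # criteria. The criteria, weights, and scores are hypothetical placeholders; in
    # practice they come from your own RFP analysis and competitive intelligence.

    # Anticipated evaluation criteria and their relative weights (sum to 1.0)
    weights = {"technical": 0.4, "past performance": 0.3, "management": 0.2, "price": 0.1}

    # Estimated scores (0-10) for each scenario against each criterion
    scenarios = {
        "Bid alone":           {"technical": 7, "past performance": 5, "management": 8, "price": 9},
        "Team with Partner A": {"technical": 8, "past performance": 8, "management": 7, "price": 7},
        "Likely competitor":   {"technical": 8, "past performance": 7, "management": 6, "price": 8},
    }

    def weighted_score(scores, weights):
        """Combine criterion scores into a single weighted total."""
        return sum(scores[c] * w for c, w in weights.items())

    # Rank the scenarios from strongest to weakest weighted score
    for name, scores in sorted(scenarios.items(), key=lambda kv: -weighted_score(kv[1], weights)):
        print(f"{name}: {weighted_score(scores, weights):.1f}")

Ranking the scenarios this way makes it easier to see whether a teaming arrangement changes your competitive position enough to justify sharing the revenue.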
  17. For each item, who does it and what comes next? Sometimes it’s not about the step, but how you set up the next one, that determines your success. Clarifying who does what, and what each person involved can expect from the others, is as important as having enough people.

One company might have lots of small customers. Another may have a few large customers. One company might have lots of different offerings. Another may only have a few. One might offer custom solutions. Another might sell commodities. There is no one way to organize marketing, pre-sales, sales, capture, and proposal development functions. This means you can't assume that everyone knows what each role is supposed to do. And because each is dependent on the others for success, you can't assume that each knows what to give to the others when it's time to hand off. Do you lose information at each step, or do you build to the finish? How does that impact your win rate?

The Sales Pipeline

What are the roles of marketing, pre-sales, sales, and capture in filling your pipeline? Where are the handoffs? What gets handed off? How should the overlaps be managed? Do you have any gaps or lack of clarity?

Who finds the leads that fill the pipeline? Is it the salesperson, or is there a pre-sales function? What gets passed on to whom?

Who qualifies the leads? And how are leads qualified? What are the standards?

Who closes the deal? Does it close with a proposal? Who has responsibility for doing what it takes to close the sale? This can be everything from designing the offering, to staffing the proposal, to overseeing delivery. If a salesperson is responsible for chasing as many leads as possible, does someone else need to be responsible for capturing the lead? If your offering is complex, you might need specialists involved to design it. When, where, and how will they be involved? And if a proposal manager is used to oversee developing the proposal, who is responsible for providing the information needed to write a winning proposal? Is sales still involved during the proposal phase?

Who decides when to drop a lead? Who is responsible for bid/no bid decisions? When do they happen?

Who owns the customer relationships? If there are handoffs from sales to capture to proposal, does one person retain ownership of the customer relationship? Or can anyone interact with the potential customer? The approach that works for a company with multiple contracts at each customer might be different from the approach that works for a company with only one contract per customer.

Intelligence Gathering

Is one person or multiple people responsible for gathering intelligence? What gets delivered for use in the proposal? Who prepares it?

Who gathers customer intelligence? What are the customer's intentions and preferences? Who are the decision makers and stakeholders? Who is supposed to find out?

Who gathers opportunity intelligence? What are the real size, scope, and requirements? What are the budget and funding issues?

Who gathers competitive intelligence? Who currently does business with the customer, and who might want to?

Offering Design and Teaming

What are the roles of sales, capture, operations, and the proposal team in determining what to offer? Who determines what the winning offer should be? Who engineers the solution? Who drafts the specifications? Who selects among the options? Who defines the offering? Who describes it, and how? Who writes about it?

Who determines the price to win? What are the price targets?
How low can you/should you go? How does this impact the offering?

Who identifies potential teaming partners? Can you/should you team, and with whom? Who provides the justification and who decides? Who is responsible for finding the right companies to team with? Who is responsible for maintaining relationships with potential teaming partners? Who negotiates and decides teaming arrangements? Who decides what roles the team members will play? Who is responsible for their contributions to the proposal? Who will decide the business arrangements?

Strategies

You can’t write about them if you don’t have them. Who should provide them? Who is responsible for identifying bid strategies? If you get to the proposal and you have no differentiators and no clear bid strategies, should that be the proposal manager’s problem to solve? Who is responsible for being able to articulate why the customer should select you? Is that something that should be considered during lead qualification or pursuit?

Who is responsible for knowing what it will take to win? You can’t build a process around what it will take to win if no one knows the customer, opportunity, and competitive environment well enough. Who is responsible for discovering what you need to know, assessing it, and turning it into your bid strategies?
  18. We have some really exciting things planned for 2017. And while it's not yet time for a public announcement, I'd like to discuss them with you so that you know what your options are...

If business development or proposal training is part of your plans for 2017, we should talk.
If assessing the effectiveness of your business development or proposal groups is part of your plans for 2017, we should talk.
If you need to assess the quality of the proposals you will prepare in 2017, we should talk.
If you're interested in new ways to roll out improvements across your entire organization in 2017, we need to talk.
If you want to simultaneously address win rate, training, quality, process, resource allocation, and return on investment, then we really need to talk.

We are launching a huge new training capability. We're also expanding our process tools to enable groups to assess their effectiveness. And upgrades to our corporate subscriptions are going to turn PropLIBRARY into an organizational improvement platform. The new offerings integrate with our current ones, providing tons of options. We're going to solve problems that have gone unsolved for decades, lower the cost of doing things you know you should do but could never afford, and give you ways to increase your competitiveness.

If you'd like to know more about what we're up to, click here and schedule a time when we can talk about it.
  19. An effective proposal group is more than just process and tools. But how do you assess its effectiveness? There are techniques and best practices, but what are the issues that impact its effectiveness? Here are five focus areas that reveal the things holding you back from maximizing your effectiveness. There are different ways to correct or improve things. But you must know what the issues are before you can select the right solution. Assuming the solution before you really understand the problem leads to continued ineffectiveness and frustration.

The first sign of an ineffective process is often frustration. Frustration happens when recurring problems aren’t resolved. The five areas below are common sources of proposal frustration.

Handoffs. As soon as your proposal efforts grow beyond one person, there are handoffs. For example, many companies struggle with the handoff from sales to the proposal. When does it take place? What gets handed off? When these questions are not explicitly answered, missed expectations at the handoffs will not only lead to frustration, but lower win rates when people don’t have the information they need to write a winning proposal. If you define the handoffs at the proposal kickoff meeting, it’s too late. People won’t have the information they need to hand off. Things will be far less frustrating for everyone involved if you define the handoffs, including who, what, where, how, when, and why, before the sales cycle starts. When people know what will be expected, they’re more likely to show up prepared. And your win rate will rise as a result.

Uncertainty regarding roles. Who should do what? If it’s not said, it won’t happen. Instead of dividing tasks up based on who’s available, try defining roles functionally. When one person plays more than one role, they take on all of the responsibilities of each of those roles. And everyone else knows what to expect based on who is playing which roles. In addition to unmet expectations and inefficiency, the big problem with uncertainty regarding proposal roles is delay. It takes longer to get things done, and you run out of time for quality assurance (who’s responsible for that anyway?). On their own, staff can usually figure out how to get things done (for better or worse). But one thing they often can’t decide on their own is who must do things. They need clarity about who has the authority to make those decisions.

Lack of resources. It makes complete sense that overloaded staff are frustrated. But more staff is not always the right answer. The right answer is whatever gives you the best return on investment (ROI). In the absence of data to establish that, most companies opt for the minimum to get things done. The minimum to get things done will not return the best ROI. To discover the best ROI, you need to collect data that correlates staffing levels with win rate, and collect enough of it to be statistically significant. Another, sometimes easier approach is to establish standards, and then measure compliance with them and their correlation with win rate. Then you staff at the level that will meet your standards. Depending on the size and volume of your bids, an increase in win rate will usually more than pay for the staff required to achieve it (see the sketch at the end of this section). Once you track your ROI, your corporate culture will change from regarding proposals as a necessary expense to be minimized to regarding them as an investment that benefits everyone. Also, take note.
Resource sufficiency applies not just to having enough staff for proposal development, but also to executive staff. Do they have enough attention to give, so that they can fulfill their commitments to the process and make timely decisions? Sometimes organizational development is primarily a matter of eliminating executive bottlenecks.

Different agendas. People differ on priorities. They differ on techniques. They differ on decisions about things like resource allocation. Ultimately they have different goals. Even though everyone may share the goal of winning a proposal, they may have other goals that differ. Expectation management is critical. Understanding why goals and techniques are chosen is critical. People will more readily accept a goal that is different from theirs when their own expectations have been voiced and they understand why the different goal is being pursued, even if they disagree with it. An effective process is more than just steps. An effective process also facilitates expectation management and guidance regarding why things should be done a particular way. Often why something should be done is more important than how. Sometimes an RFP can require you to do things differently, but often the goal will remain the same, and it can be okay for people to improvise or innovate, so long as they achieve the goal.

Who decides what quality is? Most companies treat proposal quality as a subjective thing and make it up as they go along. People debate, advocate, and discuss everything from word choice and style to bid strategies. They waste huge amounts of time talking in circles around the issues. Usually there isn’t one person who has the authority to declare what quality is, and even if there is, that just makes it one person’s opinion. You need a written definition of proposal quality that provides the criteria everyone should follow to achieve the kind of proposals the company wants to submit. Then people can argue over the criteria, but you can also settle those arguments by reviewing and publishing them. This has the added benefits of informing proposal writers what they need to accomplish and giving you metrics you can use to track correlation with your win rate. It is a key part of expectation management.
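Returning to the resourcing point above, here is a minimal sketch of the arithmetic behind comparing staffing levels by expected return rather than by cost alone. Every number in it is a hypothetical placeholder; substitute your own bid volume, award values, win rates, and proposal costs. The real work is collecting enough data for the win rate estimates to be statistically meaningful.

    # A minimal sketch of staffing-versus-win-rate arithmetic.
    # All of the numbers are hypothetical placeholders; the point is the comparison.

    def expected_return(bids_per_year, avg_award_value, win_rate, proposal_cost_per_bid):
        """Expected annual revenue won, minus the annual cost of producing proposals."""
        revenue = bids_per_year * avg_award_value * win_rate
        cost = bids_per_year * proposal_cost_per_bid
        return revenue - cost

    # Scenario 1: minimum staffing
    baseline = expected_return(bids_per_year=20, avg_award_value=2_000_000,
                               win_rate=0.25, proposal_cost_per_bid=30_000)

    # Scenario 2: added staffing raises the cost per bid, but also the win rate
    invested = expected_return(bids_per_year=20, avg_award_value=2_000_000,
                               win_rate=0.30, proposal_cost_per_bid=45_000)

    print(f"Baseline: {baseline:,.0f}")
    print(f"With added investment: {invested:,.0f}")
    print(f"Difference: {invested - baseline:,.0f}")

Even a small win rate improvement can dwarf the added proposal cost, which is why accounting for proposals as an investment, rather than an expense, changes the staffing decision.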
  20. There is a place for short-term thinking. You've got to start somewhere. But there are also risks. And it's easy to become sidetracked along the way. This creates an opportunity to beat your competitors by continuing to evolve when they get stuck.

Short-term proposal thinking

Short-term thinking can trick you out of having an investment mentality. You’re not thinking about investing in your proposal capability. You may not even be investing in the win. You think short term when you’ve got a deadline, that’s all you're accountable for, and you only have the resources you’ve been given to prepare the proposal. You may be talking about winning, but your staff are probably more focused on just getting the darn thing submitted “while doing their best to win” so they can have their life back. If those are the cards you are dealt, that’s how you play it.

A short-term proposal often involves making up a process as you go along and re-using old proposal text, on the theory that what was good enough for the last proposal is good enough for this one. If you are starting at RFP release with no information advantage, you may just have to fake your customer awareness. You are a hero. You get the job done. But your company sees you as a cost to be minimized.

Short-term techniques for proposals include having a way of doing things (a process that’s not fully documented), re-using proposal text, and getting to a draft quickly so you can fix it. Your process is probably based on having one major draft review. The proposal succeeds or fails based solely on the strength of the participants. And whether they are heroes.

If you only do proposals rarely and the values are not large enough to make them investment-worthy, you can get away with short-term thinking. When you start out with short-term thinking because you have no ability to invest, you risk having that become how you go about doing proposals when the volume picks up. This is a disease that infects both executives and proposal managers. The bridge to long-term proposal thinking is usually your business development and proposal process. Companies seek to formalize their process when they want to start doing things consistently better in the future.

Mid-term proposal thinking

Most established companies make it this far and never make it to long-term thinking because it’s so easy to get sidetracked. I know billion-dollar companies that never matured past this level. Companies who get stuck here are often trapped in a process Catch-22. They haven’t documented their process, which people wouldn't follow anyway. And even if it were documented, it would fail if followed diligently in the real world. For companies stuck in this trap, “The Process” seems to be the only way out. Mid-term proposal thinking can become an eternal struggle around “The Process” while falling back on short-term techniques to keep making proposal submissions. Just when you think things are going smoothly, staff turnover sets you back, demonstrating that your process was never independent of your people.

With long-term proposal thinking, gambling or relying on continuous heroics gives way to continuous win rate improvement. But only for those with the patience, perseverance, and strength required to build the right culture.

Long-term proposal thinking

Long-term thinking only happens when there is a multi-year commitment to continuous win rate improvement backed by the authority to drive change.
By the time companies get to long-term thinking, they have usually fallen into the business development growth trap, with multiple locations, lines of business, and territories that have evolved different ways of doing things that are hard to standardize. You can’t begin long-term thinking until you can get everyone on the same page. When you do, wonderful things can be achieved with a long-term outlook:

You recognize that winning is cross-functional and requires integration instead of stovepipes. You make everyone part of business development and the flow of information from lead identification through award. Proposals are not something a department does; they become something the entire company does.

Instead of process as required steps, you create a process that is based on guidance and staff development. Instead of steps, the process itself gets built around discovering the information needed to answer the questions that stakeholders have so they can make their contributions. You make it easier to follow the process than to complete an assignment without it by showing, accelerating, and inspiring so that people want to follow it.

Instead of a rare event, training gets built into the process and becomes continuous and never-ending. It facilitates your ability to adapt and change by giving you a vehicle to inform and guide while doing. When there is a multi-year commitment, process and training combined become the engine for organizational change that drives continuous win rate improvement.

Your review process begins to focus on risk mitigation and quality validation instead of document critique. You define proposal quality. You define quality criteria for your proposals and train your reviewers to be responsible for validating them. Quality gets built in at the beginning instead of fixes being applied at the end. When there are alternatives, reviewers and the proposal team collaborate to make decisions based on risk management and acceptance.

You get good at analytics. It is surprising how few business development or proposal groups have any analytical capability. In this context, analytics means quantifying process performance so that you can correlate decisions and activity with your win rate. Continuous change is senseless if it is uninformed. The things that actually impact your win rate are not always what you think they will be. What experts think they know is often contradicted by hard data. For long-term success, take the subjectivity about what you should do out of the equation and replace it with a culture of experimentation and analytics.

This is necessary to make the big evolutionary shift from thinking about proposals as a cost to managing them as investments. How does resource allocation impact your win rates? Is more always better? Under what circumstances? When does spending more lead to higher win rates, and when does it not? Winning proposals should be about maximizing the company's return on investment. But without analytics, all you are doing is gambling on your future. Long-term thinking minimizes gambling in favor of steadily increasing returns.

Evolving like this is something we help companies do. All the individual things we offer are part of a bigger plan. From our off-the-shelf process documentation to our pipeline assessments, review participation, and integrated process, guidance, and training, they add up to transformation into an organization that maximizes its win rates.
When you are too small to invest much, we have easy ways to get started. And once we have a relationship based on a proven ROI track record, we help our friends over the long term become truly great. Click the button if you are ready to think long term and want us to help your company win.

Contact us about improving your company's ability to win
  21. PropLIBRARY contains a ton of information that can help you solve common business development, lead capture, and proposal problems. We give many solutions away. And some are part of our premium content. The list below is a mixture of links to free content and premium content that's only available with a subscription.

If you need help finding leads before the RFP is released
If you need to fix a broken proposal
If you need to be more selective in what you bid
If you are writing a proposal even though you don't know the customer as well as you should
If you don’t have a lead capture process
If you start your proposals unprepared
If you are having trouble figuring out what to write in your proposal
If you don’t know how to create an RFP compliance matrix
If you don’t have a written definition of proposal quality
If you need to prepare for a customer debrief
If you need to conduct a proposal lessons learned review
If you need guidance for staffing your proposal
If you have to write a proposal but you don't have the background or input you need
If your business development meetings are boring, stupid, and pointless
If your proposals contain poorly written themes
If your proposals are good, but not good enough
If you don't have enough staff to work on your proposals
If you just need to prepare a simple proposal
If you think the RFP might be wired for someone else
If you are having trouble doing proposals with other people involved
If you need your proposals to be more persuasive
If you need to respond to a Request for Information (RFI) or a Sources Sought Notice
If you want to improve your process, but not replace it
If your proposals do not reflect an information advantage
If your proposals end up being merely compliant
If you need to accelerate your proposals
If you are struggling to find differentiators for your proposals
If people are ignoring your company's strategic plan
If people aren't following your proposal process
If your proposal reviews aren't working out the way they should
If you got a late start on your proposal
If you're having trouble knowing what to propose
If people are having difficulty completing their proposal assignments
If you have to use people who aren't proposal writers to produce your proposals
If your proposals are failing their reviews
If you need to fill your bid opportunity pipeline
  22. We're adding features that will turn our huge library of incredibly useful content into online training courses with exercises, quizzes, videos, and more. But that's not all... There will be special features for Corporate Subscribers. They involve customization, certification, live instructor integration, real proposal participation tracking, transcripts, mentorship programs, and more. It's not just about training. They turn PropLIBRARY into a strategic tool for developing a winning organization.

We'd like to tell you about it. And answer your questions. On Thursday, October 27th at 11am EST (UTC-5), we're hosting a webinar to do just that. This webinar is just for large companies: it is only relevant to companies with at least 50 employees, and the new features will support companies with thousands of employees. A single-user subscription is more relevant to smaller companies, and it will be getting an upgrade too. We're also doing some things for all you freelance consultants out there that will send some love your way too.

Click the button to request the webinar access details. And since it will be a small group, feel free to share something about yourself, your company, or your proposal challenges. And if you can't make it, use the button to let us know so we can email you some information.

Click here to register for the webinar
