Two Years Later, Is Your MVP Really Still Viable?
Cutting corners to launch your MVP quickly may get you feedback and sales sooner, but the elements you leave unaddressed can haunt your business with costly setbacks over the long term.
I’m a strong believer in the minimum viable product (MVP) strategy. Software development is complicated, so it makes good business sense to make tough choices and stay focused on the most important user needs in the most efficient way possible. While this means leaving things out of an MVP, one may presume a future release will remedy the matter. Unfortunately, many important considerations can get overlooked and set aside in the rush to go to market and add new features. Fast forward two or three years, and your scrappy MVP can no longer scale or meet the regulatory requirements of your customers because of those earlier omissions.
Which leads to a question about the “V” in MVP: is your MVP really still viable? Working on applications we have taken over and rehabilitated, I’ve seen some really “interesting” source code. One thing they have in common is that many of the problems (technical debt, performance inefficiencies, and data privacy liability, to name just three) were symptomatic of putting feature development ahead of everything else.
We help clients assess the extent of the problem (and its consequences for the bottom line) and give actionable recommendations. Here are some of the questions we ask when making that assessment.
The First, Critical Question
Can you even make changes to your application?
If it’s been several years since you launched the MVP, your developers may have moved on and the working knowledge and ability to build and publish new code may have moved on with them. This is a pretty common scenario, especially if your application was developed offshore or by temporary contractors. Here are some other questions to determine the true “updateability” of your product.
- Is the latest version of the application code available and in version control?
- Is there documentation explaining the steps necessary to build and publish the code? If so, do those steps still work?
- Do you have the login credentials, SSH keys, code-signing certificates, or other authentication secrets needed to access the hosting infrastructure or app store?
- For mobile apps, does your app meet Apple’s or Google’s latest API or OS requirements? Does your software integrate with third-party services like Pendo, Google Analytics, or reCAPTCHA, or with SSO providers like Google, Microsoft, Clever, or ClassLink? If so, what version of each external service is your software using, how long will it be supported, and do you have login credentials for those services?
- How much of the quality assurance testing is automated? Are there test cases describing the expected behaviors of the software?
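
Even a lightweight automated test suite captures expected behavior in a form that survives developer turnover. Below is a minimal pytest-style sketch; the `validate_password` helper and its rules are hypothetical and simply stand in for whatever business rules your product enforces.

```python
# A minimal pytest-style test case describing expected behavior.
# The validate_password helper and its rules are hypothetical examples.

def validate_password(password: str) -> bool:
    """Accept passwords of at least 12 characters that contain a digit."""
    return len(password) >= 12 and any(ch.isdigit() for ch in password)


def test_rejects_short_password():
    assert validate_password("abc123") is False


def test_accepts_long_password_with_digit():
    assert validate_password("correct-horse-battery-9") is True
```

If tests like these exist and still pass, a new team can change the code with far more confidence.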
Changing Regulatory Requirements
Something that was excluded at the time of the MVP launch (e.g., because it was an optional best practice) may very well have become an industry standard or a regulated requirement by now. Adhering to them is important for the benefit and safety of your end-users and for your business, since school districts frequently require compliance as part of adoption. There are three key areas that should be evaluated.
Security
- What version of the software framework, platform, or libraries was the application built on?
- Have there been security releases to those libraries since the MVP launched? Are those versions still supported?
- If there’s been any gap in maintaining these, what vulnerabilities remain open, and how might the application or the data within it have been exposed to exploitation or compromise?
- Does any custom source code (i.e., code developed from scratch) show an understanding of the OWASP Top 10? For example, do database queries use parameterized statements or a query wrapper, do forms have CSRF protections, is user input validated and output escaped, and are passwords secured with a salted, one-way hash? (Two of these practices are sketched after this list.)
- Are security events and other important system activity logged in an auditable way?
- Is admin access to the application and its infrastructure limited to those who need access (and not via shared usernames or passwords)?
- Is there a plan in place to help mitigate a security vulnerability or denial of service attack?
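
To make a couple of those questions concrete, here is a minimal Python sketch, using only the standard library, of what parameterized queries and a salted, one-way password hash look like in practice. The users table and its columns are hypothetical, and a production system would more likely rely on a vetted library such as bcrypt or Argon2 than on hand-rolled code.

```python
# Minimal sketch: parameterized queries and salted, one-way password hashing.
# Table and column names are hypothetical; standard library only.
import hashlib
import hmac
import secrets
import sqlite3


def find_user(conn: sqlite3.Connection, email: str):
    # Parameterized query: user input is bound, never concatenated into SQL.
    return conn.execute(
        "SELECT id, password_hash, salt FROM users WHERE email = ?", (email,)
    ).fetchone()


def hash_password(password: str, salt: bytes) -> bytes:
    # Salted, one-way hash (PBKDF2); the original password is never stored.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)


def new_credentials(password: str) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(16)
    return hash_password(password, salt), salt


def verify_password(password: str, salt: bytes, stored_hash: bytes) -> bool:
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(hash_password(password, salt), stored_hash)
```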
Privacy
- What personally identifiable information (PII) is stored in the application?
- Is that data encrypted when transmitted and stored?
- Is information collected and used in a way that complies with COPPA, FERPA, GDPR, and other government regulations?
- Does the application send emails that contain PII?
- Are there any copies of the application on other test servers or developer computers? If so, has the data in those copies been scrubbed of real-user PII? (A minimal scrubbing example follows this list.)
- If the application has PII, do you have a response plan in case of a security incident that allows access to PII?
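
As a sketch of what scrubbing a test copy might look like, the snippet below rewrites identifying fields with placeholders. The users table and its columns are hypothetical; adapt the field list to whatever PII your schema actually stores.

```python
# Minimal sketch: scrub real-user PII from a test copy of a SQLite database.
# The users table and its columns are hypothetical examples.
import sqlite3


def scrub_pii(db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    with conn:
        # Replace identifying fields with deterministic placeholders so the
        # test data stays usable but no longer maps back to real people.
        conn.execute(
            """
            UPDATE users
            SET full_name = 'Student ' || id,
                email     = 'user' || id || '@example.com',
                phone     = NULL,
                birthdate = NULL
            """
        )
    conn.close()
```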
Accessibility
- Will a person with impaired vision or hearing be able to navigate the application?
- Does the application source code use the correct markup, alt text, and ARIA attributes to support screen readers, keyboard navigation, and other accessibility supports? (A quick spot check for one of these is sketched after this list.)
- If the application has content being added by staff, have they received training on how to create accessible content?
- Does the application have user-generated content? If so, how are you ensuring its accessibility?
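
Dedicated tooling (axe, WAVE, or Lighthouse, for example) is the right way to audit accessibility, but even a quick spot check can surface obvious gaps. The sketch below flags `<img>` tags that lack an alt attribute, using only the Python standard library.

```python
# Minimal sketch: flag <img> tags with no alt attribute in an HTML snippet.
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        # Count image tags that do not declare an alt attribute at all.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1


checker = MissingAltChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="School logo">')
print(f"Images missing alt text: {checker.missing}")  # -> 1
```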
Longer Term Concerns
If your hope is to turn your aging MVP into something sustainable long-term, then consider a range of other maintainability factors.
Having built and supported applications that have thrived for decades without costly overhauls, still serving hundreds of thousands of educators monthly, we’ve learned that long-term support isn’t just about a product’s source code, but also about how that product fits into a long-term strategic plan. Some key questions to consider are:
- Is the source code commented and organized in a way that makes it easy to comprehend — especially if it’s only edited every few years?
- Do the technologies used for this product align with your broader technology toolset and expertise?
- What is the health of the software ecosystem your product is built upon? Is that foundation nearing the end of its shelf life, meaning the source code will become a lot more expensive to maintain?
- Do you have analytics data, transcripts from focus groups or interviews, or customer support inquiries that could inform how your product is actually being used? (Several years after launch, the use cases imagined rarely match reality.)
- Does your solution still fill a market need or has your business shifted in ways that have introduced friction or incongruity? Is the solution still competitive in the current market landscape?
- Are there software features that aren’t used anymore? Can maintainability and security be improved by simplifying the product and removing them?
This is a big list, and it’s really only a starting point. If it seems intimidating, the good news is that a lot of it can be answered fairly quickly by an experienced team. If you find yourself still treating your product like an MVP two years later, then it’s time to evaluate and address the technical debt accruing behind the scenes. Our team has more than 30 years of combined experience in the education technology industry. Reach out today for a free consult.