In my work on SSDs, I have the opportunity to test a wide variety of storage software, from parallel filesystems to high-performance databases. SSD device vendors are always interested in demonstrating compatibility and (ideally) great performance at the application level. From my time working on "the other side of the wire," here are three things I wish every storage software vendor provided to strengthen the relationship between SSDs, the application, and ultimately end users:
1. Create a storage-specific microbenchmark. This benchmark should exercise as much of the application's IO stack as possible and focus on the aspect(s) of performance most relevant to the application. It should measure a very limited number of metrics (or even a single key metric). The hardware requirements should be kept modest, using the minimum quantity of SSDs and servers: if a benchmark requires six or more servers plus multiple clients, few will want to run it more than once (if at all), let alone run it regularly. A well-written storage microbenchmark helps SSD vendors articulate the value of their products in metrics that matter to the application, and helps end users validate an SSD vendor's performance claims without the complexity and cost of testing at full application scale. Clever storage software vendors will create scores and scoreboards to encourage SSD vendors to promote their results. Microbenchmarks can also be added to the SQA flow as extended performance monitors to guard against application-specific performance regressions. One of the best examples of a well-executed storage microbenchmark is the Aerospike Certification Tool (ACT), which uses tail latency as its figure of merit.
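To make the "single key metric" idea concrete, here is a minimal sketch of a tail-latency microbenchmark. This is not ACT itself; the scratch file, the 4 KiB IO size, and the choice of p99 as the figure of merit are all illustrative assumptions, and a real tool would target the raw device through the application's actual IO path.

```python
import os
import random
import tempfile
import time

def p99_read_latency(path, io_size=4096, n_ios=1000):
    """Issue random aligned reads and return the p99 latency in microseconds."""
    size = os.path.getsize(path)
    fd = os.open(path, os.O_RDONLY)
    latencies = []
    try:
        for _ in range(n_ios):
            # Pick a random io_size-aligned offset within the file
            offset = random.randrange(0, size - io_size) // io_size * io_size
            t0 = time.perf_counter_ns()
            os.pread(fd, io_size, offset)
            latencies.append((time.perf_counter_ns() - t0) / 1000)
    finally:
        os.close(fd)
    latencies.sort()
    return latencies[int(0.99 * len(latencies))]

# Demo against a scratch file; page-cache hits make this unrealistically fast,
# but the structure (one workload, one number) is the point.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(4 * 1024 * 1024))  # 4 MiB of data to read back
    scratch = f.name
print(f"p99 read latency: {p99_read_latency(scratch):.1f} us")
os.unlink(scratch)
```

Boiling the result down to one number is what makes a score (and a scoreboard) possible.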
2. Create a storage certification or compatibility program. The certification can be done in-house, provided to vendors as a self-certification, or enabled through a 3rd party; each approach has its advantages. The ultimate goal is to help end users make device selections that improve their chance of first-pass success when deploying the application. Create a logo program and/or a certification/compatibility list so that SSD vendors can promote that they work with the application. The certification can be in written form (i.e., a test plan), but is ideally partially or fully automated: the more automated it is, the easier it is to run frequently or integrate into regular regression testing. VMware (like them or not these days) has a very complete certification program. Once passed, vendors get listed on a public certification site and gain access to a logo program (e.g., "VMware Ready"). A certification need not be nearly as complex to add significant value; even a simple compatibility test goes a long way toward ensuring first-pass customer success.
3. Create an SSD specification. The SSD specification should outline all the requirements needed to operate successfully with the application. Beyond hard requirements, an SSD specification can also provide guidance on what is desired in the future, giving SSD device vendors something to consider when designing new products or prioritizing new feature development. The largest example of such a specification is the OCP Datacenter SSD Specification developed by Meta, Microsoft, and other large organizations (the latest version can be found in the OCP contribution database). While that specification is extremely large, even a couple of thoughtful pages from a storage software provider help SSD device vendors understand what is needed to operate successfully with the application. Requirements listed in an SSD specification can and should be validated by a certification or compatibility test.
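The link between a specification and a compatibility test is easiest to see when the requirements are machine-checkable. Below is a hypothetical sketch: spec requirements expressed as predicates, validated against a device's reported properties. The field names (`lba_sizes`, `capacity_tb`, `plp`) and thresholds are invented for illustration and are not drawn from any real SSD specification.

```python
# Each spec requirement is a named predicate over a device report.
SPEC_REQUIREMENTS = {
    "supports_4k_lba": lambda d: 4096 in d["lba_sizes"],
    "min_capacity_tb": lambda d: d["capacity_tb"] >= 1.92,
    "power_loss_protection": lambda d: d["plp"] is True,
}

def check_device(device):
    """Return the list of spec requirements the device fails (empty = pass)."""
    return [name for name, ok in SPEC_REQUIREMENTS.items() if not ok(device)]

# A hypothetical device report, as a compatibility test might collect it
device_report = {"lba_sizes": [512, 4096], "capacity_tb": 3.84, "plp": True}
failures = check_device(device_report)
print("PASS" if not failures else f"FAIL: {failures}")
```

Even a small set of requirements in this form can be run automatically against every candidate device, which is exactly what lets the certification test from item 2 validate the specification from item 3.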
TL;DR: My dream conversation with a storage software vendor would start with something like "here is my SSD specification, here is a microbenchmark you can run to see how you compare to other SSD devices, and here is a certification test you can run to get on our compatibility list."