In 2008, I led a technical investigation with application engineers and external vendor engineers to resolve a compliance failure affecting the vendor’s upcoming graphics card. My analysis revealed that the root cause was the vendor’s use of an outdated, non-standard testing methodology. I therefore advised the vendor to switch to the updated methodology, under which the card passed.
The vendor’s engineers, however, would not accept this explanation, and I faced a predicament. On the one hand, I could continue pressing the vendor to adopt the new methodology, but that risked frustrating them and jeopardizing our business relationship. On the other hand, if I accepted the vendor’s request to retain the older methodology, the production date would have to be pushed back to accommodate modifications to the card.
This was when I recognized a common failing of teams: focusing so heavily on process that divergent opinions and approaches obscure the ultimate goal. Since the sole purpose of a compliance test is to ensure interoperability between graphics cards and monitors, discrepancies between methodologies are irrelevant as long as the end-user visual experience is not compromised.
I therefore took a third, consumer-oriented approach: I tested the graphics card against TV monitors from various brands to verify that the visual quality was acceptable. Presented with this data, the vendor finally agreed to waive the failure status.
With this experience in mind, I expect to play a similar conflict-resolution role at LBS. Since first-year study group members come from diverse ethnic and professional backgrounds, divergence of opinion is to be expected. I hope to bring the group a level of objectivity that allows us to look past our differences and tackle issues strategically, with the overall objective in mind.