Threshold Solution Criteria – in order to be considered, a Submission must meet all of the following:
- Submissions must provide a technical solution, rather than a policy or legal solution.
- The tool must work on home IoT devices that currently exist on the market.
- The tool must protect information it collects both in transit and at rest.
- The Submission must address how the tool will avoid or mitigate any additional security risks that the tool itself might introduce into the consumer’s home by, for example, probing the home network or facilitating software upgrades.
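The in-transit and at-rest protection requirements above can be sketched in minimal form. The sketch below is illustrative only and uses hypothetical function names: it enforces certificate validation and a TLS 1.2 floor for any reporting connection, and adds an HMAC tag so stored records are tamper-evident. Note that HMAC provides integrity, not confidentiality; a real tool would encrypt records at rest with a vetted cipher from a maintained cryptography library rather than anything hand-rolled.

```python
import hashlib
import hmac
import secrets
import ssl

# In transit: require certificate validation and a modern TLS floor
# for any connection the tool opens to a backend service.
def make_transit_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()  # verifies certificates by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

# At rest: tag each stored record so tampering is detectable.
# (Integrity only -- confidentiality needs a vetted encryption library.)
def seal_record(key: bytes, record: bytes) -> bytes:
    tag = hmac.new(key, record, hashlib.sha256).digest()
    return tag + record

def open_record(key: bytes, sealed: bytes) -> bytes:
    tag, record = sealed[:32], sealed[32:]
    expected = hmac.new(key, record, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # constant-time comparison
        raise ValueError("record was modified or key is wrong")
    return record

key = secrets.token_bytes(32)
sealed = seal_record(key, b'{"device": "thermostat", "fw": "1.4.2"}')
```

The constant-time `hmac.compare_digest` check matters: a naive `==` comparison can leak timing information that helps an attacker forge tags.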
a. The Abstract. The abstract should include a title for the Submission and a brief explanation of how the tool functions.
b. The Video. The video need only demonstrate how the tool would be used with one (1) IoT device that is likely to be found in consumers’ homes.
The video must address the Judging Criteria and:
i) state what the tool is specifically designed to do;
ii) describe the set-up for the demonstration and any assumptions the Contestant has made about the capabilities and limitations of the device(s) used in the demonstration; and
iii) explain what impact the tool would have on the software of IoT devices beyond what is demonstrated in the video.
c. The Detailed Explanation must provide sufficient material so that the Judges can evaluate the tool properly for how well it works, how user-friendly it is, and how scalable it is.
Judging Criteria (a summary, see the Rules for complete details):
(i) How well does it work? (60 points out of 100 total score)
a. How well does your Submission address each of these four (4) components?
- Recognizing what IoT devices are operating in the consumer’s home. A tool may automatically recognize devices or provide instructions for consumer input.
- Determining what software version is already on those IoT devices. A tool may automatically recognize the software version or provide instructions for consumer input.
- Determining the latest versions of the software that should be on those devices.
- Assisting in facilitating updates, to the extent possible.
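The four components above can be sketched as a simple pipeline. Everything in the sketch below is a stated assumption: the device names, the `LATEST_KNOWN` lookup table, and the stubbed discovery function are all hypothetical. A real tool might discover devices via mDNS/SSDP probes or consumer input (component 1), and would query vendor update services rather than a hard-coded table (component 3).

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    installed: str  # firmware version reported by the device

# Hypothetical lookup table standing in for vendor update services
# or a maintained firmware/vulnerability database.
LATEST_KNOWN = {
    "smart-bulb": "2.1.0",
    "thermostat": "1.4.2",
}

def parse_version(v: str) -> tuple:
    # Compare dotted versions numerically, so "1.10.0" > "1.9.9".
    return tuple(int(part) for part in v.split("."))

def discover_devices() -> list:
    # Stub for component 1: real discovery might probe the local
    # network or walk the consumer through manual entry.
    return [Device("smart-bulb", "2.0.3"), Device("thermostat", "1.4.2")]

def outdated(devices: list) -> list:
    # Components 2-4: compare each installed version against the
    # latest known version and flag devices needing an update.
    report = []
    for d in devices:
        latest = LATEST_KNOWN.get(d.name)
        if latest and parse_version(d.installed) < parse_version(latest):
            report.append((d.name, d.installed, latest))
    return report
```

Component 4 (facilitating updates) is intentionally left as a report here, since update mechanisms vary by vendor; a fuller tool might deep-link into each vendor's update flow or trigger updates where an API permits it.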
b. WILDCARD: If your Submission does not address the four components above, but offers a technical solution to address vulnerabilities caused by unpatched or out-of-date software of IoT devices in the home, the Contestant may demonstrate how that tool would work and argue for the superiority of the tool based on its level of innovation and impact on IoT security in the home. Any such WILDCARD option would also need to meet the criteria set forth in sections 7(ii)-(iii) (user friendliness and scalability requirements).
(ii) How user-friendly is your tool? (20 points out of 100 total score)
a. How easy is your tool for the average consumer, without technical expertise, to set up and use? In assessing how easy the tool would be to use, the Judges will take into consideration whether functions are performed automatically, without action by the consumer.
b. In analyzing the user-friendliness of the tool, the Judges will also take into consideration how well the tool does the following:
- Displays or conveys information about which devices it has assessed.
- Accurately communicates the risk mitigation provided by the tool (e.g., it should not give the impression that it solves all security problems).
- Allows consumers to control any information being sent to a third party, to the extent that any such information is being sent. This includes making short, but accurate, disclosures about the information flow.
(iii) How scalable is your tool? (20 points out of 100 total score)
a. The Submission must explain how the tool could be used for products other than those addressed specifically in the Submission.
(iv) Optional items (up to 10 bonus points)
The Submission may also address other ways to help consumers guard against broader security vulnerabilities in IoT device software in their homes.