This is actually the subject of a blog entry Mark will be posting at some stage soon, on why it’s there in Script Debugger 6. But the short answer…
The main role is so that the user will get an error if they try to run a script on an unsupported version. So it’s a good way to document the version requirements of a script for both author and runner. If you’re not concerned about that, or you want your script to support very old versions, there’s no need for it – but especially with things like script libraries and ASObjC, presumptions of backwards compatibility are more tenuous.
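For example, here is a minimal sketch of that behavior (the version "2.4" is just an example value; it’s the AppleScript version that shipped with OS X 10.10):

```applescript
-- On a system older than OS X 10.10, this script refuses to run
-- with a version error before the script body ever executes.
use AppleScript version "2.4"

log "Running on AppleScript 2.4 or later"
```

Run on a supported system, the statement is effectively a no-op; its value is the clear, checked declaration of what the script needs.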
It also means you have a use statement when you start writing a script, so you probably need a use scripting additions statement if you’re intending to use scripting addition commands. If you don’t have either, and you decide to start using a script library later on, you have to remember to also add the use scripting additions statement, which in turn introduces some subtle changes (for the better) in behavior. And we think it’s generally better to start off with the new behavior, rather than risk surprises.
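A typical modern script header might look like the sketch below (the version number and the framework line are examples, not requirements; the framework line only matters if you’re calling into ASObjC):

```applescript
use AppleScript version "2.4" -- example minimum; 2.4 shipped with OS X 10.10
use scripting additions -- needed once any use statement is present
use framework "Foundation" -- only if the script uses ASObjC

-- display dialog is a scripting addition command; without the
-- use scripting additions line above, it would fail at runtime.
display dialog "Hello"
```

The key subtlety: once a script contains any use statement, scripting addition commands are no longer implicitly available, which is why the second line goes in from the start.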
So on that basis, we think it’s modern AppleScript best practice, if you will.
And if you’re using ASObjC, Script Debugger’s code-completion uses the version information to exclude classes and methods introduced after the version specified, so you don’t accidentally write scripts that won’t run where you need them to. If AppleScript introduces new commands, we will be able to do something similar there.
You should never get an error if the version specified is earlier than, or the same as, that of the OS you’re running on; if you see that in Sierra, do log a bug with Apple.