Defaults driving me nuts

use AppleScript version "2.4"
use scripting additions
use framework "Foundation"

property userDefaults : a reference to current application's NSUserDefaults
property srcDF : 0

tell standardUserDefaults() of userDefaults
	log {"srcDF 0", srcDF}
	registerDefaults_({|sourceDF|:srcDF})
	set srcDF to (objectForKey_("sourceDF")) as integer
	log {"srcDF 1", srcDF}
end tell

A simple thing in Script Editor:
log {"srcDF 0", srcDF} --> 0
log {"srcDF 1", srcDF} --> 0

In Xcode, for the real app:
log {"srcDF 0", srcDF} --> 0
log {"srcDF 1", srcDF} --> 1

What am I missing?

I suspect you’ve set the value for the app built in Xcode at some stage. In the case of Script Editor, I’m not sure what you’re trying to prove — you’re changing its defaults.

My example was probably bad.
My app doesn't read (write) defaults as expected:

# some code...
(do shell script "defaults read com.spherico.SRT-Editor tgtDF") as integer
--> 0
tell standardUserDefaults() of current application's NSUserDefaults
	set theDF to (objectForKey_("tgtDF")) as integer
	--> 1
end tell
# more code

What do you see if you run this:

set defaults to current application's NSUserDefaults's standardUserDefaults()
set theDict to defaults's dictionaryRepresentation()
current application's NSLog("%@", theDict)

I get the system settings

2018-08-20 09:43:24.753997+0200 SRT Editor[10337:1305063] {
    AKDeviceUnlockState = 0;
    AKLastCheckInAttemptDate = "2018-07-26 16:22:48 +0000";
    AKLastCheckInSuccessDate = "2018-07-26 16:22:50 +0000";
...
    AppleMeasurementUnits = Centimeters;
    AppleMetricUnits = 1;
    AppleMiniaturizeOnDoubleClick = 0;
    AppleScrollerPagingBehavior = 1;
    AppleShowAllExtensions = 1;
    AppleTemperatureUnit = Celsius;
    CGDisableCursorLocationMagnifi...

I wonder if you’re misunderstanding what registerDefaults: does. See if this helps:

set defaults to current application's NSUserDefaults's standardUserDefaults()
defaults's removeObjectForKey:"someKey" -- wipe any current value
defaults's registerDefaults:{someKey:true} -- set default
current application's NSLog("default %@", defaults's objectForKey:"someKey")
defaults's setObject:false forKey:"someKey" -- set value
current application's NSLog("false %@", defaults's objectForKey:"someKey")
defaults's setObject:true forKey:"someKey" -- change value
current application's NSLog("true %@", defaults's objectForKey:"someKey")
defaults's registerDefaults:{someKey:false} -- change default, which will do nothing to current value
current application's NSLog("current %@", defaults's objectForKey:"someKey")
defaults's removeObjectForKey:"someKey" -- wipe the current value, so it falls back to default
current application's NSLog("default %@", defaults's objectForKey:"someKey")

set defaults to current application's NSUserDefaults's standardUserDefaults()

I have used user defaults since the old days of ASS (AppleScript Studio), and all the apps I created with bindings to the user defaults work fine; often enough there were a real lot of them.
So why, inside the applicationWillFinishLaunching handler, does the line above return the system user defaults?

It includes all domains, including NSGlobalDomain.
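
Since dictionaryRepresentation() merges every domain in the search list, one way to inspect only what your app has stored on disk is persistentDomainForName:. A sketch, assuming the bundle identifier quoted earlier in the thread:

```applescript
use framework "Foundation"

set defaults to current application's NSUserDefaults's standardUserDefaults()
-- persistentDomainForName: returns only the values persisted for that domain,
-- without NSGlobalDomain and without the (in-memory) registration domain
set appDomain to defaults's persistentDomainForName:"com.spherico.SRT-Editor"
current application's NSLog("%@", appDomain)
```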

Hmm,

I fear I don't understand.

If I create an app in Xcode and want to set/call/change its user defaults, will it always include the global ones???
That wouldn't make sense to me.

I suggest you hunt up the documentation, perhaps starting here:

https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/UserDefaults/AboutPreferenceDomains/AboutPreferenceDomains.html

The important things to understand are the concept of domains, and how the registration domain effectively acts as a fallback.
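
One practical consequence worth noting: the registration domain lives only in memory, so values set with registerDefaults: never show up when the shell `defaults` tool reads the on-disk plist. That would explain the earlier 0-versus-1 discrepancy. A sketch (the key name is just an illustration):

```applescript
use framework "Foundation"

set defaults to current application's NSUserDefaults's standardUserDefaults()
defaults's registerDefaults:{|demoKey|:7} -- goes into the registration domain, in memory only
-- objectForKey: falls back to the registered value when nothing is persisted:
current application's NSLog("%@", defaults's objectForKey:"demoKey")
-- ...but the shell tool only sees the persistent domain on disk, so something like
--   do shell script "defaults read com.spherico.SRT-Editor demoKey"
-- would report that the key/value pair does not exist.
```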

Thanks for the link.

I started a brand new project and cleaned out all the Xcode stuff I could find.
Then copied the code to the new project and everything is fine.

Thanks again for all your patience.

I’ve read through the page, and I do clearly understand the user defaults/preferences usage.
Nothing was wrong with my code.
Obviously I didn’t describe the problem correctly and/or posted wrong examples.

Anyway - everything works now except a little thing.

	on popUpAction:sender
		set curVal to (defaults's objectForKey:"tgtFpsID")
		set curVal to curVal as integer
		if curVal is in {3, 6} then
			set enableTgtDF to 1
		else
			set enableTgtDF to 0
		end if
		defaults's setObject:enableTgtDF forKey:"enableTgtDF"
		defaults's setObject:0 forKey:"tgtDF"
	end popUpAction:

[image: the popup]

The popup’s content and selected index are stored in the user defaults and connected via bindings in IB.
The check box’s state and enable status are stored in the user defaults and connected via bindings in IB.

So the action will/should set the check box's state to 0 and either enable or disable the check box.
This only works randomly.
BUT binding the hidden status of the check box to (not enableTgtDF) always works perfectly.
Any ideas?

It could be that you’re using setObject:forKey: with a value that isn’t a (Cocoa) object. Try something like this:

	on popUpAction:sender
		set curVal to (defaults's objectForKey:"tgtFpsID")
		set curVal to curVal as integer
		if curVal is in {3, 6} then
			set enableTgtDF to true
		else
			set enableTgtDF to false
		end if
		defaults's setBool:enableTgtDF forKey:"enableTgtDF"
		defaults's setBool:false forKey:"tgtDF"
	end popUpAction:

tgtDF always works as expected. enableTgtDF always needs “a few actions” before the check box becomes disabled, while passing that value to “Hidden” with “NSNegateBoolean” always works.
It’s not an issue in this particular case, but it could be at a later stage.

I will give it a try.

		if curVal is in {3, 6} then
			set enableTgtDF to true
		else
			set enableTgtDF to false
		end if
		defaults's setBool:enableTgtDF forKey:"enableTgtDF"
		defaults's setObject:0 forKey:"tgtDF"

Works perfectly. Thanks.