So I'm porting a project from Xcode 5 to Xcode 7, and I'm running into some very strange behavior: a class sets an integer instance variable on itself, and when it does, the integer value comes back garbled (as if there's some strange pointer referencing going on).
Here's the basic gist of the code:
#import <Foundation/Foundation.h>

@interface MyClass : NSObject
{
    int myValue;
    NSString *myString;
}
+ (id) defaultMyClass;
- (BOOL) setUpMyValues;
@end
@implementation MyClass

+ (id) defaultMyClass
{
    return [[[MyClass alloc] init] autorelease];
}

- (id) init
{
    self = [super init];
    myString = nil;
    return self;
}

- (BOOL) setUpMyValues
{
    myValue = 21;
    myString = @"real string value";
    return YES;
}

@end
Stepping through in lldb, this is what I end up with:
> p myValue
> (int) $1 = 3285152 // This number changes on each run
> p myString
> (NSString *) $3 = 0x003220a0 class name = MyClass
> p self
> (MyClass *) $5 = 0x0013a6f0
> p self[0]
> (MyClass) $6 = {}
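In case it helps to narrow things down, these are the kinds of additional lldb checks I can run from the same frame (listed without output, since I haven't captured anything conclusive yet):

(lldb) p self->myValue          // read the ivar through the object pointer rather than by bare name
(lldb) p/x &myValue             // the address lldb thinks the ivar lives at
(lldb) p (void *)self           // the object's own address, for comparison
(lldb) image lookup -t MyClass  // which definition of the type the debugger is actually using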
I suspect the values I'm getting for self and self[0] are related to the issue. In the debugger's variables view, expanding self only shows a single [0] entry, so it looks like this:
[A] self = (MyClass *) 0x13a6f0
[0] (MyClass)
//Nothing further expands.
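I don't know whether it's related, but my init also doesn't guard against [super init] returning nil or a different object; the more conventional pattern would be something like this (sketch only):

- (id) init
{
    self = [super init];
    if (self) {
        // alloc already zero-fills ivars, so this is just being explicit
        myValue = 0;
        myString = nil;
    }
    return self;
}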
I really have no idea what could be causing this. Does anyone have any suggestions?