Working towards a Mac release...



I'm almost ready to release a "beta" bugfix version of Blades of Exile for the Mac.

 

There's one major problem, though – when opening it from the Finder, nothing happens. The icon may appear briefly in the Dock, but then it immediately closes.

 

I managed to debug by outputting the error message to a file rather than stdout, and it seems that it's getting a nsvErr result code when attempting to open Blades of Exile Sounds... yet it has no problem with Blades of Exile Graphics, and the code is essentially identical.

 

Any ideas?

 

 

(By the way, is it actually necessary to call InitCursors()?)


Try loading something other than Sounds after loading Graphics. If it behaves the same way, the problem likely occurs while loading the first resource.

Could it somehow be messing with the directory?

 

Do these lines give an error?

Quote:
char *path = "Scenario Editor/Blades of Exile Sounds";

error = FSPathMakeRef((UInt8*) path, &gRef, false);

By the way, I assume you checked all your resources ...

 

Chokboyz

 


It looks, from the code Chokboyz quoted, like you're using a relative path. Could you instead get the path to the executable, and then by appending the relative path to it, end up with an absolute path that will work regardless of the current working directory?

 

(As I've mentioned before, I'm suspicious that the working directory may not be the same under these two different launch scenarios.)
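That suggestion might be sketched like so in portable C++ (a minimal sketch; the helper name and example paths are mine, not from the BoE source):

```cpp
#include <string>

// Hypothetical helper: resolve a resource path against the directory the
// application itself lives in, instead of the current working directory,
// which can differ between a Finder launch and a command-line launch.
std::string resolve_resource(const std::string& app_dir, const std::string& rel) {
    if (!rel.empty() && rel[0] == '/') return rel; // already absolute
    return app_dir + "/" + rel;
}
```

The real code would obtain `app_dir` once at startup (e.g., from the path to the application bundle) and route every resource open through something like this.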

Originally Posted By: Niemand
(As I've mentioned before, I'm suspicious that the working directory may not be the same under these two different launch scenarios.)

A shifting working directory was exactly the Windows problem with launching custom scenarios.
The fix made was the one you mentioned, so we're suspecting the same thing here smile

Chokboyz
Originally Posted By: Niemand
(As I've mentioned before, I'm suspicious that the working directory may not be the same under these two different launch scenarios.)
Actually, based on where my log file appeared, I'm forced to conclude that the working directory is the root directory! Yet, that doesn't explain why Sounds got the error and Graphics didn't.

Still, the code to get the path to the application package's current directory does appear later in the function, so I'll move it around and try it.

EDIT: And yay! It works now.
Originally Posted By: Celtic Minstrel
Yet, that doesn't explain why Sounds got the error and Graphics didn't.

It's simply that loading Graphics somehow messes with the working directory wink

Originally Posted By: Celtic Minstrel
And yay! It works now.

Good job smile
And on to the release?

Chokboyz

Well, once I test it, I'll release it. wink

 

I haven't yet written the code to read and write creature_save, stored_items, or the conversation and encounter notes, so these will be lost with save and reload. I'll fix that soonish.

 

Also, I don't consider the save format completely finalized, partly because of its incompleteness and partly because I still haven't finished the refactoring. I want to replace all C-strings with STL-strings, replace many arrays with vectors, and possibly merge creature_data_type with creature_start_type.

 

Hopefully the release will be tonight or tomorrow.

 

EDIT: Oh, if someone could post a Windows save file, or point me to where I can find one, that'd be helpful...


Good question... I'll check...

 

EDIT: Wait, I might have forgotten to call it. That could be a problem...

 

EDIT2: Yes, it works now. It displays at the top left of the screen rather than centred, but that's a minor issue that can be fixed later (though it's probably quite simple to fix).

 

Now I'm getting EXC_BAD_ACCESS on fwrite... (EDIT3: Actually, on flockfile() which is apparently called by fwrite) so, I think I'll just release the version which can't save. If I manage to fix it quickly I may replace that file with a new one.

 

It can be found here. Any bugs found may be reported either in this thread or on this page. It can't save at the moment, but that may change by tomorrow if I get it fixed by then. It can however load old-format saves, and it should be able to load Windows saves as well as Mac ones (though I haven't tested this).

 

Oh... if anyone has a PowerMac, could they test to see if loading works? Preferably trying it with both a Windows and Mac save.


Thanks! That should be helpful.

 

I'm running into a problem with saving – I can't seem to get the uncompressed 10-byte header into the file. I'm thinking I may need to write to a buffer instead (using deflate() instead of gzwrite(), maybe) and then write the header and the buffer to the file.

 

 

EDIT: And loading Windows saves does not work, so I'll need to work on that too.


Wasn't the idea to put the header inside the compression? I guess that would make the internal tar format invalid, though. Even with the purely technical details aside, I don't see any way to reasonably include a header and make the file be both valid gzip format and valid tar format, which in turn brings into question whether it's worth trying to mimic both formats at all. This may be something to think about.

 

With regard to writing a header at all: Can you not open the file with fopen(), write your uncompressed data, then pass the file descriptor to gzdopen() and continue by writing compressed data? What exactly goes wrong?

Originally Posted By: Niemand
With regard to writing a header at all: Can you not open the file with fopen(), write your uncompressed data, then pass the file descriptor to gzdopen() and continue by writing compressed data? What exactly goes wrong?
I tried that, but for some reason it didn't work. I also tried using gzflush(), but that didn't work either; the header was compressed anyway.

...Actually, maybe it wasn't compressed; it seems it simply appears after a gzip header. Which is a problem; perhaps the solution is to avoid gzwrite().

Originally Posted By: Niemand
Wasn't the idea to put the header inside the compression? I guess that would make the internal tar format invalid though. Even with the purely technical details aside, I don't see any way to reasonably include a header and make the file be both valid gzip format and tar format. Which in turns brings into question even bothering to try to mimic both formats at all. This may be something to think about.
The reason for the header not being compressed was so that the game could open the file and check the header to determine whether it is in the new or old format. Also so that, with the scenario format, the old version can give the "Created with later version" message.

As for the question of why bother to mimic the formats... I guess there's no good reason to mimic the tar format (I could just as easily create my own archive-like format that doesn't align to 512 bytes). The purpose of the gzip format would be the compression, but I suppose I could compress the data without writing as a gzip format. That would mean I need to use either compress() or deflate() rather than gzwrite().





By the way, is there any way to get the *stream libraries to read and write char variables as numbers rather than characters? Many of the values are stored as char or unsigned char, and I'd rather not have control characters appearing in the sub-files. Obviously I could cast to int on output, but what about input? Must I read into an int and then convert? Or is there an easier way?

Do you really need an uncompressed old-style header for backwards semi-compatibility? Would it be sufficient that the gzip (or other compression) header fails to match the header old versions of the game expect, so that those old versions realize they can't read it? (Indeed, the headers do mismatch sufficiently: the gzip header's first two bytes, interpreted as a short, have a value of 8075, and it appears to me that the original BoE code rejects any save file whose first two bytes are not either 5790 or 1342.)
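That magic-number arithmetic can be checked with a minimal sketch (the helper name is mine):

```cpp
// Read two bytes as a big-endian 16-bit value, the way the original
// big-endian Mac code would interpret the start of a file. The gzip
// magic bytes 0x1f 0x8b come out as 0x1f8b == 8075, which matches
// neither of the values (5790 or 1342) the original BoE loader accepts.
short be_short(unsigned char hi, unsigned char lo) {
    return (short)((hi << 8) | lo);
}
```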

 

With regard to your last point: I take it what you want to do is things like:

Code:
char num=50;
some_ostream << num;	//wanting the file to contain "50", instead of "2"
some_istream >> num;	//wanting to get 50, not just 5

Obviously you are right about output, just cast to int. For input, the best I can come up with at the moment is read into an int, do bounds checking, and then store the result to the char variable. Note that if you use the stream classes' formatted input capabilities you also really must check the fail bit after every read to make sure that you actually got input. Alternatively, you could turn on exception throwing for the stream, and just write code to handle exceptions. I've been meaning to get around to trying that, but haven't yet.
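That read-into-an-int approach might look like this (a sketch; the function names are mine):

```cpp
#include <climits>
#include <istream>
#include <sstream>
#include <string>

// Read a number from a formatted stream into a char: go through an int so
// that operator>> parses digits instead of extracting one character, check
// the fail bit, and bounds-check before narrowing.
bool read_char_as_number(std::istream& in, signed char& out) {
    int tmp;
    if (!(in >> tmp)) return false;                        // fail bit set
    if (tmp < SCHAR_MIN || tmp > SCHAR_MAX) return false;  // out of range
    out = (signed char)tmp;
    return true;
}

// Small wrapper for demonstration: parse from a string, returning -1 on
// failure (adequate here since the sample inputs are non-negative).
int parse_char(const std::string& s) {
    std::istringstream in(s);
    signed char c = 0;
    return read_char_as_number(in, c) ? (int)c : -1;
}
```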

Originally Posted By: Celtic Minstrel
And loading Windows saves does not work, so I'll need to work on that too.

It seems you only check whether the file is new-format in your code (load_party() in FILEIO.cpp).
Because your processor is Intel and the Windows file doesn't have a 0x0B0E town flag, it is considered a legacy Mac file and treated as big-endian ... which it obviously should not be smile
Or am I mistaken? confused

I wonder if I should implement this backward cross-platform compatibility too ...

Chokboyz
Originally Posted By: Niemand
Do you really need an uncompressed old-style header for backwards semi-compatibility? Would it be sufficient that the gzip (or other compression) header fails to match the header old versions of the game expect, so that those old versions realize they can't read it? (Indeed, the headers do mismatch sufficiently: the gzip header's first two bytes, interpreted as a short, have a value of 8075, and it appears to me that the original BoE code rejects any save file whose first two bytes are not either 5790 or 1342.)
You're right, it doesn't actually matter much for the savefile format, because the original Blades of Exile did not include code to recognize that a save file was created with a later version of the game. However, it does include code to recognize if a scenario was created with a later version of the editor. Therefore it is important to retain the original header so that if anyone tries to open the scenario with an older version they get a "created with later version" error instead of a "not a BoE scenario" error.

I think I'll try building up the file in memory, and compress it using the compress() function. Then I can write the whole thing to the file with fstream.
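That plan might be sketched like this (the compression step itself is left out; in the real code the payload would be the output of zlib's compress(), and the header bytes here are a hypothetical marker, not the actual BoE values):

```cpp
#include <string>
#include <vector>

// Assemble a save image in memory: a fixed 10-byte uncompressed header
// followed by the payload, built in one buffer so the whole thing can be
// written to disk with a single fstream write.
std::vector<char> build_save_image(const std::string& payload) {
    std::vector<char> out(10, 0);   // 10-byte uncompressed header
    out[0] = 0x0B; out[1] = 0x0E;   // hypothetical format marker
    out.insert(out.end(), payload.begin(), payload.end());
    return out;
}
```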

Originally Posted By: Niemand
Obviously you are right about output, just cast to int. For input, the best I can come up with at the moment is read into an int, do bounds checking, and then store the result to the char variable. Note that if you use the stream classes' formatted input capabilities you also really must check the fail bit after every read to make sure that you actually got input. Alternatively, you could turn on exception throwing for the stream, and just write code to handle exceptions. I've been meaning to get around to trying that, but haven't yet.
I'm not doing any error-checking at all on input at the moment, but that's on my to-do list.

Originally Posted By: Chokboyz
It seems you only check whether the file is new-format in your code (load_party() in FILEIO.cpp).
Because your processor is Intel and the Windows file doesn't have a 0x0B0E town flag, it is considered a legacy Mac file and treated as big-endian ... which it obviously should not be smile
Or am I mistaken? confused


My code is something like this (I stripped out all the code to check the values of the flags and update the booleans appropriately):
Code:
if(mac_is_intel && flags.a == 0x0B0E){ // new format
    format = new_oboe;
}else if(!mac_is_intel && flags.a == 0x0E0B){ // new format
    format = new_oboe;
}else if(flags.a == mac_flags[0][0] || flags.a == mac_flags[0][1]){ // old format
    if(mac_is_intel){ // it's actually a windows save
        format = old_win;
    }else{ // mac save
        format = old_mac;
    }
}else if(flags.a == win_flags[0][0] || flags.a == win_flags[0][1]){ // old format
    if(mac_is_intel){ // it's actually a macintosh save
        format = old_mac;
    }else{ // win save
        format = old_win;
    }
}else format = unknown;
Since the 0x0B0E flag is not present, the first two ifs should be jumped over. Then it compares the first flag to check if it's a valid save. Since Windows uses the same flags as Mac, this check should succeed in two cases: the file is a Mac file, but the processor is PPC; or, the file is a Windows file, and the processor is Intel. The latter is the case here, so it enters the if statement and then checks to see if the processor is Intel (it is). And then it sets the format to old_win.

In theory, anyway. I'll have to debug it to see if it actually works correctly.


Originally Posted By: Chokboyz
I wonder if i should implement this backward cross-platform compitibility too ...
It's not necessary, but it could be nice. You would have to swap the mac_flags and win_flags arrays from my code, and you would only need the branches that execute if mac_is_intel is set to true (hence you don't even need the variable). You'd change the name of the existing flags array to win_flags, and make mac_flags the array of the byte-swaps of the elements of win_flags. If the first flag is 0x0B0E, it's new format; if it's win_flags[0][0] or win_flags[0][1], you have an old windows save; and if it's mac_flags[0][0] or mac_flags[0][1], it's a mac save. Then you'd just need to do the porting process iff the save is a Mac save (you can take my port_* functions and my flip_long function from porting.cpp if you like).
Originally Posted By: Celtic Minstrel
Then it compares the first flag to check if it's a valid save. Since Windows uses the same flags as Mac, this check should succeed in two cases: the file is a Mac file, but the processor is PPC; or, the file is a Windows file, and the processor is Intel. The latter is the case here, so it enters the if statement and then checks to see if the processor is Intel (it is). And then it sets the format to old_win.

I'm a little confused here ... confused
Since the flags are the same on Windows and Mac, how is the code supposed to recognize them? (The processor being Intel is not a valid check: an old-format Mac game would be flagged as Win old format as soon as the processor is Intel ...)
And aren't win_flags[] and mac_flags[] the same thing?

I must be missing something ...
Chokboyz

P.S.: I will also need Mac saves for testing if I decide to implement this wink

The mac_flags array in my code is the original flags array. The win_flags array is essentially the same, except every element is byte-swapped.

 

The key difference between the Mac saves and the Windows saves is the endianness. The Mac saves were created by a big-endian program, hence if the same file is read on a (little-endian) Windows computer the relevant portions will need to be byte-swapped for it to make any sense. The same is true if the Mac save is read on an Intel Mac. Therefore, if we're in little-endian (either Windows or Intel Mac) then the presence of the original flags indicates a Windows save. If we discover the flags are in fact byte-swapped, then it's a Mac save.

 

You don't need to worry about the possibility that your code may be run in a big-endian environment (I don't think...); therefore your code would not need to consider as many cases as mine does. Hence, if you find the normal flags, it's a Windows save. If they're byte-swapped, it's a Mac save.
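Concretely, with a 16-bit byte swap (a sketch; the function name is mine, though it mirrors the flip_short in the source):

```cpp
// Byte-swap a 16-bit value. A flag written as 5790 (0x169E) by the
// big-endian Mac build reads back byte-swapped as 0x9E16 on a
// little-endian machine, which is exactly how a Mac save can be told
// apart from a Windows one containing the same logical flags.
unsigned short flip_short_u(unsigned short s) {
    return (unsigned short)(((s & 0xFF) << 8) | ((s >> 8) & 0xFF));
}
```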

 

 

 

Now, for some obscure reason, sentry::sentry() called by operator << is raising EXC_BAD_ACCESS when attempting to insert a constant string into the output stream. It worked fine before – I can't figure out what's different now that's causing it to fail.

 

EDIT: Replacing the c-string with an stl-string resulted in the following error message:

Code:
terminate called after throwing an instance of 'std::ios_base::failure'
  what():  basic_ios::clear
That sounds as if an exception was thrown because there was no error... smirk

What stream is this? Also, are you sure that the stream was in good condition before you tried to write the string, i.e., did you check the fail bit and the bad bit?

 

With regard to the endianness: It seems like the issue is muddled because system endianness is conflated with operating system. Why not have the code always store and read data in a single endianness, and only swap if it knows that it's on hardware of the opposite endianness?

 

Since a binary of the software will only actually be compiled to run on hardware of a single endianness, you could include endianness swaps conditionally at compile-time based on target architecture (this isn't too hard), or you could do it the quick-and-dirty way, making the swaps conditional on a runtime check of the current endianness.

 

In short, I think it's much better to write code that will compile to run on any reasonable hardware, rather than inserting artificial differences into the Macintosh and Windows code. Yes, only the Mac OS code is likely to ever run on a big-endian system, but you never know, and doing it the other way is easier in the short term anyway.

Originally Posted By: Niemand
What stream is this? Also, are you sure that the stream in good condition before you tried to write the string, i.e., did you check the fail bit and the bad bit?
It's a stringstream – good() returned true, and then it crashed anyway. (Now it's back to EXC_BAD_ACCESS in sentry.) Calling clear() before writing didn't help either. (Oddly, if I call neither good() nor clear() first, I get the exception; otherwise, I get the EXC_BAD_ACCESS.)

Originally Posted By: Niemand
With regard to the endianness: It seems like the issue is muddled because system endianness is conflated with operating system.
Yeah, I guess I have done that. I could rename variables to make it clearer that it's really the endianness rather than the architecture that it depends on.

Originally Posted By: Niemand
Why not have the code always store and read data in a single endianness, and only swap if it knows that it's on hardware of the opposite endianness?
The code reads and writes in little-endian, except that when an old-format Mac file is encountered, it's interpreted in big-endian. I'm even going to go so far as to write the Windows flags in the scenario header, so that the flags identify endianness rather than OS.

Originally Posted By: Niemand
Since a binary of the software will only actually be compiled to run on hardware of a single endianness, you could include endianness swaps conditionally at compile-time based on target architecture (this isn't too hard), or you could do it the quick-and dirty way, making the swaps be conditional on a runtime check of the current endianness.
Well, it's actually a universal binary now, but it seems to compile every file twice, so I guess you're right. I'm doing a runtime check for architecture at the moment, using Gestalt.

Originally Posted By: Niemand
In short, I think it's much better to write code that will compile to run on any reasonable hardware, rather than inserting artificial differences into the Macintosh and Windows code. Yes, only the Mac OS code is likely to ever run on a big-endian system, but you never know, and doing it the other way is easier in the short term anyway.
Okay, so I should probably replace that Gestalt architecture test with an endian test. There's a simple way to do it with a union, I believe...


...but first I would like to get this saving to work properly!

The runtime endianness check is perfectly good; it's not like this is really critical for performance. (You're right about the Universal binary basically compiling files twice, so if you used a compile-time check, it would just do the check in both, with different results in each, I think.) Indeed, the way you describe it being handled sounds like exactly the way I would want it done, so I don't see where there's even a problem, unless it's that the Windows code is hard-coded to a single endianness.

 

Anyway, let's try to tackle your crash. As I have personally never succeeded in triggering a crash within the iostream code itself, I must admit to being rather baffled. Could you post the relevant code?


Well, technically the Gestalt is checking for architecture, not endianness, so since we really want to check for endianness it's probably a good idea to change that. Probably not high priority though.

 

 

As for the code causing a crash (the whole of the function up to the point of the crash, excluding commented lines):

Code:
bool in_scen = univ.party.scen_name != "";
bool in_town = univ.town.num < 200;
bool save_maps = !univ.party.stuff_done[306][0];
FSRef dest_ref;
char* path = new char[200];
FSpMakeFSRef(&dest_file, &dest_ref);
FSRefMakePath(&dest_ref, (UInt8*)path, 200);
ofstream fout(path);
short flags[] = {
	0x0B0E, // to indicate new format
	in_town ? 1342 : 5790, // is the party in town?
	in_scen ? 100 : 200, // is the party in a scenario?
	save_maps ? 5567 : 3422, // is the save maps feature enabled?
	0x0100, // current version number, major and minor revisions only
	// Version 1 indicates a beta format that won't be supported in the final release
};
if(!mac_is_intel) // must flip all the flags to little-endian
    for(int i = 0; i < 5; i++)
        flip_short(&flags[i]);
fout.write((char*) flags, 5*sizeof(short));
delete path;
header_posix_ustar header;
static char footer[sizeof(header_posix_ustar)] = {0};
ostringstream sout(""), bout("");
string tar_entry;
size_t tar_size, y;
int x;
sout << string("");
Replacing it with sout << "" also gets the crash. The first time the crash occurred, though, the string being inserted was "AGE " in a function called on the next line – I added this sout for debugging purposes only.

As a tangential observation, is univ.party.scen_name an std::string? Because if not, that first line is not likely to do anything useful.

 

I cannot, however, find anything wrong with this code. I even put together a mockup program, which ran fine without crashing:

Code:
#include <iostream>
#include <sstream>
#include <fstream>
#include <string>
#include <cstdio>

using namespace std;

struct header_posix_ustar {
	char name[100];
	char mode[8];
	char uid[8];
	char gid[8];
	char size[12];
	char mtime[12];
	char checksum[8];
	char typeflag[1];
	char linkname[100];
	char magic[6];
	char version[2];
	char uname[32];
	char gname[32];
	char devmajor[8];
	char devminor[8];
	char prefix[155];
	char pad[12];
};

void flip_short(short* s){
	*s=((*s&255)<<8)|((*s>>8)&255);
}

int main(){
	bool in_scen = true;
	bool in_town = true;
	bool save_maps = false;
	bool mac_is_intel=false;

	char* path = new char[200];
	sprintf(path,"./temp.dat");
	ofstream fout(path);
	short flags[] = {
		0x0B0E, // to indicate new format
		in_town ? 1342 : 5790, // is the party in town?
		in_scen ? 100 : 200, // is the party in a scenario?
		save_maps ? 5567 : 3422, // is the save maps feature enabled?
		0x0100, // current version number, major and minor revisions only
		// Version 1 indicates a beta format that won't be supported in the final release
	};
	if(!mac_is_intel) // must flip all the flags to little-endian
		for(int i = 0; i < 5; i++)
			flip_short(&flags[i]);
	fout.write((char*) flags, 5*sizeof(short));
	delete path;

	header_posix_ustar header;
	static char footer[sizeof(header_posix_ustar)] = {0};
	ostringstream sout(""), bout("");
	string tar_entry;
	size_t tar_size, y;
	int x;
	sout << string("AGE") << endl;

	return(0);
}

 

Are you sure the crash is where you think it is?

 

Also: My version of flip_short is 25% faster at -O0 and 75% faster at -O3 than Jeff's. Yes, I wasted the time to test. No, it is not the least bit important. tongue

Originally Posted By: Niemand
As a tangential observation, is univ.party.scen_name an std::string? Because if not, that first line is not likely to do anything useful.
Uh... it's supposed to be, but I think I haven't yet done that...

EDIT: Never mind, it is a std::string. I was thinking of scenario.scen_name, which isn't a string yet.

Originally Posted By: Niemand
I cannot, however, find anything wrong with this code. I even put together a mockup program, which ran fine without crashing:

Are you sure the crash is where you think it is?
I couldn't find anything wrong either. I debugged, and it crashed when stepping over the sout << line. The stack display in the debugger showed that, in addition to the functions called so far, operator << and ostream::sentry::sentry were on the stack. (When I got an exception, only operator << was reached, and I'm guessing an rdbuf() call failed.)

EDIT2: Also, I don't think it's anything to do with what I'm trying to output. I tried sout << x, and got the exception.

Originally Posted By: Niemand
Also: My version of flip_short is 25% faster at -O0 and 75% faster at -O3 than Jeff's. Yes, I wasted the time to test. No, it is not the least bit important. tongue
I don't really feel like changing it. It works, and that's what matters.
