Hello.
During the holidays I've been developing an application on my desktop computer at home.
I set up a repository on GitHub, so when I got back to work I cloned the repo onto my laptop.
It wouldn't work.
The app consists of a client and a server. Strangely enough, the server would segfault at a strcpy right at the beginning, while the client would bug me about not supplying a command-line parameter (it's supposed to work without one anyway).
So I SSHed into an office machine we use for testing, cloned the repo, and the problems are inverted!
Now it's the client that segfaults, while the server demands a parameter!
My machines are:
desktop - i7-2600K with Ubuntu 12.04 x64 (English)
laptop - Core 2 Duo with Ubuntu 12.04 x64 (English)
test PC - Core 2 Duo with Ubuntu 10.04 (Italian; not sure whether x86 or x64)
Now, I could accept that an app developed on a single PC might require some tinkering on other PCs, but the same app showing exactly symmetrical behaviour on two different PCs is something I can't understand.
Anyway, the specific code that seems to be the problem is the following:
Code:
struct arguments
{
    int *Z_DEBUG, *M_DEBUG;
    char *interf;
    char *outfile;              /* Argument for -o */
};

int main(int argc, char **argv) {
    struct arguments arguments;

    /* outstream, MAIN_DEBUG, ZMQ_DEBUG, argp and s_catch_signals() are defined elsewhere */
    outstream = stdout;
    arguments.M_DEBUG = &MAIN_DEBUG;
    arguments.Z_DEBUG = &ZMQ_DEBUG;
    strcpy( arguments.interf, "eth0" );   /* <-- this is the strcpy that segfaults */
    arguments.outfile = NULL;
    s_catch_signals();
    argp_parse(&argp, argc, argv, 0, 0, &arguments);
    /* ... */
Either I get a segfault at that strcpy, or somehow argp_parse exits the program.
I'm not expert enough to understand why declaring the following is correct:
Code:
char *mystring="useless phrase";
while this is wrong:
Code:
char *mystring;
strcpy(mystring, "useless phrase");

And even more so, I can't understand why, if it's wrong, it would work on my desktop computer!
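If it helps, this is how I currently picture the difference, written out as a minimal standalone test (the 32-byte buffer is just made up for the example and isn't taken from my actual code):
Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Case 1: the pointer is initialized with a string literal, so it
       points at valid (read-only) storage and can be read safely. */
    char *literal = "useless phrase";
    printf("%s\n", literal);

    /* Case 2: the pointer is never initialized, so strcpy would write
       through whatever garbage address it happens to hold. Left
       commented out because it may or may not crash depending on
       that garbage. */
    /*
    char *uninitialized;
    strcpy(uninitialized, "useless phrase");
    */

    /* Case 3: an array provides its own storage for strcpy to fill. */
    char buffer[32];
    strcpy(buffer, "useless phrase");
    printf("%s\n", buffer);

    return 0;
}
Is case 2 really the same thing I'm doing with arguments.interf?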
Any help is really appreciated.