Bug 81988 - Valgrind crashes when SIGCHLD is received...
Status: RESOLVED DUPLICATE of bug 82114
Alias: None
Product: valgrind
Classification: Developer tools
Component: general
Version: 2.1.1
Platform: Compiled Sources (Linux)
Importance: NOR crash
Target Milestone: ---
Assignee: Julian Seward
Reported: 2004-05-22 03:07 UTC by Nick Woods
Modified: 2004-07-20 16:23 UTC



Description Nick Woods 2004-05-22 03:07:12 UTC
I have a program that forks a child process and then performs a "wait" on the
child pid in the parent process.  After the child process exits and signal 17
(SIGCHLD) is delivered, valgrind appears to crash with the trace enclosed
below.

I am running valgrind with --tool=none, built from the very latest source as
of 2004-05-21 (fetched via CVS) to make sure I had all of the latest bug
fixes.  I've also tried making valgrind follow child processes, but I see the
same problem.  Any ideas or potential workarounds?

Nick
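
A minimal sketch of the pattern described above (illustrative only; this is
not the actual program, whose crash trace follows):

/* Hypothetical reproducer for the reported pattern: the parent forks
 * a child and waits on the child pid; when the child exits, the
 * kernel delivers SIGCHLD (signal 17 on x86 Linux) to the parent. */
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid;
    int status;

    pid = fork();
    if (pid < 0) {
        perror("fork");
        return 1;
    }
    if (pid == 0)
        _exit(0);                  /* child exits immediately */
    waitpid(pid, &status, 0);      /* parent waits on the child pid */
    printf("reaped child %d\n", (int)pid);
    return 0;
}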


==20275== Nulgrind, a binary JIT-compiler for x86-linux.
==20275== Copyright (C) 2002-2004, and GNU GPL'd, by Nicholas Nethercote.
==20275== Using valgrind-2.1.2.CVS, a program supervision framework for x86-linux.
==20275== Copyright (C) 2000-2004, and GNU GPL'd, by Julian Seward.
==20275== For more details, rerun with: -v
==20275== 
==20275== warning: Valgrind's pthread_attr_destroy does nothing
==20275==          your program may misbehave as a result
==20275== warning: Valgrind's pthread_attr_destroy does nothing
==20275==          your program may misbehave as a result
==20275== warning: Valgrind's pthread_attr_destroy does nothing
==20275==          your program may misbehave as a result
==20275== warning: Valgrind's pthread_cond_destroy is incomplete
==20275==          (it doesn't check if the cond is waited on)
==20275==          your program may misbehave as a result
==20275== warning: Valgrind's pthread_cond_destroy is incomplete
==20275==          (it doesn't check if the cond is waited on)
==20275==          your program may misbehave as a result
==20275== warning: Valgrind's pthread_cond_destroy is incomplete
==20275==          (it doesn't check if the cond is waited on)
==20275==          your program may misbehave as a result
==20302== Nulgrind, a binary JIT-compiler for x86-linux.
==20302== Copyright (C) 2002-2004, and GNU GPL'd, by Nicholas Nethercote.
==20302== Using valgrind-2.1.2.CVS, a program supervision framework for x86-linux.
==20302== Copyright (C) 2000-2004, and GNU GPL'd, by Julian Seward.
==20302== For more details, rerun with: -v
==20302== 
got signal 17 in LWP 20275 (20275)

valgrind: vg_signals.c:2015 (vg_async_signalhandler): Assertion `vgPlain_ksigismember(&uc->uc_sigmask, sigNo)' failed.
==20275==    at 0xB802DE80: vgPlain_skin_assert_fail (vg_mylibc.c:1211)

sched status:

Thread 1: status = WaitSys, associated_mx = 0x0, associated_cv = 0x0
==20275==    at 0x81F85427: poll (../sysdeps/unix/sysv/linux/poll.c:63)
==20275==    by 0x807B672: HttpServer::run(HttpServerConfig &, HttpServerPack &, vector<SocketTunnelConfig *, allocator<SocketTunnelConfig *> > &) (HttpServer.cxx:963)
==20275==    by 0x8085772: PvAccelerator::start_listen_on_services(void) (PvAccelerator.cxx:1834)
==20275==    by 0x808410C: PvAccelerator::run(void) (PvAccelerator.cxx:1609)
==20275==    by 0x808D285: main (pvac.cxx:79)

Thread 2: status = WaitSys, associated_mx = 0x0, associated_cv = 0x0
==20275==    at 0x81F5A5E1: nanosleep (in /lib/i686/libc-2.2.4.so)
==20275==    by 0x815F4935: PvConnectionPoolScrubber::run(void) (PvConnection.cxx:837)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 3: status = WaitCV, associated_mx = 0x817E7F8, associated_cv = 0x817E878
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x8124405: PvCondition::wait(void) (../../include/utils/PvCondition.h:133)
==20275==    by 0x815BDAE0: PvTimer::run(void) (PvTimer.cxx:96)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 4: status = WaitCV, associated_mx = 0x81B79E4, associated_cv = 0x81B7A64
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x8124405: PvCondition::wait(void) (../../include/utils/PvCondition.h:133)
==20275==    by 0x815C5C22: PvConsumerPool::getTask(long) (PvConsumerPool.cxx:239)
==20275==    by 0x812A5407: PvConsumer::getTask(long) (../../../include/utils/PvConsumerPool.h:276)
==20275==    by 0x813A7FCE: _PvAsyncSink_Consumer::run(void) (PvAsyncSink.cxx:251)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 5: status = WaitCV, associated_mx = 0x81B9FD4, associated_cv = 0x81BA054
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x8124405: PvCondition::wait(void) (../../include/utils/PvCondition.h:133)
==20275==    by 0x815C5C22: PvConsumerPool::getTask(long) (PvConsumerPool.cxx:239)
==20275==    by 0x812A5407: PvConsumer::getTask(long) (../../../include/utils/PvConsumerPool.h:276)
==20275==    by 0x813A5AF2: _PvAsyncFileWriter_Consumer::run(void) (PvAsyncFileWriter.cxx:77)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 6: status = WaitCV, associated_mx = 0x813AEBA0, associated_cv = 0x813AEC20
==20275==    at 0x81ACB376: pthread_cond_timedwait (vg_libpthread.c:1329)
==20275==    by 0x812F7EBD: PvCondition::wait(long, long) (../../../include/utils/PvCondition.h:158)
==20275==    by 0x815BDC8E: PvTimer::run(void) (PvTimer.cxx:140)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 7: status = WaitCV, associated_mx = 0x81BA96C, associated_cv = 0x81BA9EC
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x8124405: PvCondition::wait(void) (../../include/utils/PvCondition.h:133)
==20275==    by 0x815C5C22: PvConsumerPool::getTask(long) (PvConsumerPool.cxx:239)
==20275==    by 0x812A5407: PvConsumer::getTask(long) (../../../include/utils/PvConsumerPool.h:276)
==20275==    by 0x813A7FCE: _PvAsyncSink_Consumer::run(void) (PvAsyncSink.cxx:251)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 8: status = WaitSys, associated_mx = 0x0, associated_cv = 0x0
==20275==    at 0x81F85427: poll (../sysdeps/unix/sysv/linux/poll.c:63)
==20275==    by 0x8127350F: PvAsyncFetcher::run(void) (PvAsyncFetcher.cxx:247)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 9: status = WaitSys, associated_mx = 0x0, associated_cv = 0x0
==20275==    at 0x81F5A5E1: nanosleep (in /lib/i686/libc-2.2.4.so)
==20275==    by 0x81251696: PvScrubThread::threadScrubLoop(void) (PvScrubThread.cxx:328)
==20275==    by 0x8125178D: PvScrubThread::run(void) (PvScrubThread.cxx:348)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 10: status = WaitCV, associated_mx = 0x82CC76C, associated_cv = 0x82CC760
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x81CE4EDD: Agentpp::Synchronized::cond_timed_wait(timespec const *) (threads.cpp:280)
==20275==    by 0x81CE4E88: Agentpp::Synchronized::wait(void) (threads.cpp:265)
==20275==    by 0x81CE65CA: Agentpp::TaskManager::run(void) (threads.cpp:743)
==20275==    by 0x81CE5530: Agentpp::thread_starter(void *) (threads.cpp:481)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 11: status = WaitCV, associated_mx = 0x82CC88C, associated_cv = 0x82CC880
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x81CE4EDD: Agentpp::Synchronized::cond_timed_wait(timespec const *) (threads.cpp:280)
==20275==    by 0x81CE4E88: Agentpp::Synchronized::wait(void) (threads.cpp:265)
==20275==    by 0x81CE65CA: Agentpp::TaskManager::run(void) (threads.cpp:743)
==20275==    by 0x81CE5530: Agentpp::thread_starter(void *) (threads.cpp:481)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 12: status = WaitCV, associated_mx = 0x82CC99C, associated_cv = 0x82CC990
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x81CE4EDD: Agentpp::Synchronized::cond_timed_wait(timespec const *) (threads.cpp:280)
==20275==    by 0x81CE4E88: Agentpp::Synchronized::wait(void) (threads.cpp:265)
==20275==    by 0x81CE65CA: Agentpp::TaskManager::run(void) (threads.cpp:743)
==20275==    by 0x81CE5530: Agentpp::thread_starter(void *) (threads.cpp:481)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 13: status = WaitCV, associated_mx = 0x82CCAAC, associated_cv = 0x82CCAA0
==20275==    at 0x81ACB133: pthread_cond_wait (vg_libpthread.c:1293)
==20275==    by 0x81CE4EDD: Agentpp::Synchronized::cond_timed_wait(timespec const *) (threads.cpp:280)
==20275==    by 0x81CE4E88: Agentpp::Synchronized::wait(void) (threads.cpp:265)
==20275==    by 0x81CE65CA: Agentpp::TaskManager::run(void) (threads.cpp:743)
==20275==    by 0x81CE5530: Agentpp::thread_starter(void *) (threads.cpp:481)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 14: status = WaitSys, associated_mx = 0x0, associated_cv = 0x0
==20275==    at 0x81F86B3E: select (in /lib/i686/libc-2.2.4.so)
==20275==    by 0x81CEF136: Agentpp::RequestList::receive(int) (request.cpp:1151)
==20275==    by 0x81C4BBB6: PvSNMPMonitor::run(void) (PvSNMPMonitor.cxx:203)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 15: status = WaitSys, associated_mx = 0x0, associated_cv = 0x0
==20275==    at 0x81F85427: poll (../sysdeps/unix/sysv/linux/poll.c:63)
==20275==    by 0x81B59AAC: PvTransform::OS_Pipe::execute(PvTransform::VirtualMachine *, PvTransform::Type *, PvTransform::String *, PvTransform::MFile *) (OS.cxx:267)
==20275==    by 0x81B59432: PvTransform::OS_Pipe::execute(PvTransform::VirtualMachine *) (OS.cxx:77)
==20275==    by 0x81B4CE4D: PvTransform::Instruction::execute(PvTransform::VirtualMachine *) (TypeList.cxx:264)
==20275==    by 0x81B4CF1F: PvTransform::Instruction::execute(PvTransform::VirtualMachine *) (TypeList.cxx:245)
==20275==    by 0x81B4D341: PvTransform::CompoundProcedure::execute(PvTransform::VirtualMachine *) (Procedures.cxx:55)
==20275==    by 0x81B4CE4D: PvTransform::Instruction::execute(PvTransform::VirtualMachine *) (TypeList.cxx:264)
==20275==    by 0x81B4E1D0: PvTransform::IfElse::execute(PvTransform::VirtualMachine *) (Procedures.cxx:433)

Thread 16: status = WaitSys, associated_mx = 0x0, associated_cv = 0x0
==20275==    at 0x81F5A5E1: nanosleep (in /lib/i686/libc-2.2.4.so)
==20275==    by 0x815FACA2: PvServerMonitor::run(void) (PvServerMonitor.cxx:212)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 17: status = WaitCV, associated_mx = 0x835B1F4, associated_cv = 0x835B274
==20275==    at 0x81ACB376: pthread_cond_timedwait (vg_libpthread.c:1329)
==20275==    by 0x812F7EBD: PvCondition::wait(long, long) (../../../include/utils/PvCondition.h:158)
==20275==    by 0x815C5C09: PvConsumerPool::getTask(long) (PvConsumerPool.cxx:239)
==20275==    by 0x812A5407: PvConsumer::getTask(long) (../../../include/utils/PvConsumerPool.h:276)
==20275==    by 0x8120C932: HttpRequestHandler::run(void) (HttpRequestHandler.cxx:158)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 18: status = WaitCV, associated_mx = 0x835B1F4, associated_cv = 0x835B274
==20275==    at 0x81ACB376: pthread_cond_timedwait (vg_libpthread.c:1329)
==20275==    by 0x812F7EBD: PvCondition::wait(long, long) (../../../include/utils/PvCondition.h:158)
==20275==    by 0x815C5C09: PvConsumerPool::getTask(long) (PvConsumerPool.cxx:239)
==20275==    by 0x812A5407: PvConsumer::getTask(long) (../../../include/utils/PvConsumerPool.h:276)
==20275==    by 0x8120C932: HttpRequestHandler::run(void) (HttpRequestHandler.cxx:158)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 19: status = WaitCV, associated_mx = 0x835B1F4, associated_cv = 0x835B274
==20275==    at 0x81ACB376: pthread_cond_timedwait (vg_libpthread.c:1329)
==20275==    by 0x812F7EBD: PvCondition::wait(long, long) (../../../include/utils/PvCondition.h:158)
==20275==    by 0x815C5C09: PvConsumerPool::getTask(long) (PvConsumerPool.cxx:239)
==20275==    by 0x812A5407: PvConsumer::getTask(long) (../../../include/utils/PvConsumerPool.h:276)
==20275==    by 0x8120C932: HttpRequestHandler::run(void) (HttpRequestHandler.cxx:158)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)

Thread 20: status = WaitCV, associated_mx = 0x839CB6C, associated_cv = 0x839CBEC
==20275==    at 0x81ACB376: pthread_cond_timedwait (vg_libpthread.c:1329)
==20275==    by 0x812F7EBD: PvCondition::wait(long, long) (../../../include/utils/PvCondition.h:158)
==20275==    by 0x815C5C09: PvConsumerPool::getTask(long) (PvConsumerPool.cxx:239)
==20275==    by 0x812A5407: PvConsumer::getTask(long) (../../../include/utils/PvConsumerPool.h:276)
==20275==    by 0x812A49CE: PvPerfMonitor::run(void) (PvPerfMonitor.cxx:308)
==20275==    by 0x815C4FF6: PvThread_threadHandler (PvThread.cxx:136)
==20275==    by 0x81ACA221: thread_wrapper (vg_libpthread.c:843)
==20275==    by 0xB80113DB: do__quit (vg_scheduler.c:1797)


Note: see also the FAQ.txt in the source distribution.
It contains workarounds to several common problems.

If that doesn't help, please report this bug to: valgrind.kde.org

In the bug report, send all the above text, the valgrind
version, and what Linux distro you are using.  Thanks.
Comment 1 Tom Hughes 2004-06-13 20:39:27 UTC
Do you have some sort of test case for this? A simple test program with a fork/waitpid combination doesn't seem to reproduce it.
Comment 2 Nick Woods 2004-06-14 00:59:45 UTC
I will try to put together a simple test program.  I believe it only happens when the fork is in a thread, not when the forking is done in the main thread.  It seems as if there is a general problem with valgrind and signals received by sub-threads.  If I can get a simple test case to fail, I'll send it.
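
For illustration, the kind of test case being described, with the fork and
waitpid performed from a sub-thread rather than from the main thread, might
look like this (hypothetical sketch, not the reporter's code):

/* Hypothetical test case: fork/waitpid from a sub-thread, so SIGCHLD
 * is delivered while a non-main thread is running. */
#include <pthread.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

static void *forker(void *arg)
{
    pid_t pid;
    int status;
    (void)arg;

    pid = fork();
    if (pid == 0)
        _exit(0);                  /* child exits, raising SIGCHLD */
    if (pid > 0)
        waitpid(pid, &status, 0);  /* reap from inside the thread */
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, forker, NULL);
    pthread_join(t, NULL);
    return 0;
}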
Comment 3 Tom Hughes 2004-07-20 16:23:47 UTC
This is effectively a duplicate of bug 82114, as the underlying problem in both cases is that valgrind asserts when a signal is received. More information on the cause is available on the other bug, so I am closing this one as a duplicate.
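
For context, the check that fails in vg_async_signalhandler amounts to roughly
the following, rewritten here with the standard POSIX equivalent of Valgrind's
internal vgPlain_ksigismember helper (a paraphrase of the assertion shown in
the trace, not Valgrind's actual code):

/* Paraphrase of the assertion at vg_signals.c:2015: the async signal
 * handler expects the signal just delivered to be marked as blocked
 * in the interrupted context's saved signal mask.  A SIGCHLD arriving
 * in a context whose saved mask does not include it trips the assert
 * and aborts valgrind, as seen in the trace above. */
#include <assert.h>
#include <signal.h>
#include <ucontext.h>

static void async_signalhandler(int sigNo, siginfo_t *info, void *opaque)
{
    ucontext_t *uc = (ucontext_t *)opaque;
    (void)info;
    assert(sigismember(&uc->uc_sigmask, sigNo));
    /* ...hand the signal on to the client program... */
}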

*** This bug has been marked as a duplicate of 82114 ***