hadoop - "single point of failure": what happens when the JobTracker node goes down while Map jobs are running or writing output?

I'm new to Hadoop. I want to know what happens when the "single point of failure" JobTracker node goes down while Map jobs are either running or writing their output. Does the JobTracker start the Map jobs over again once it comes back up?

The JobTracker is a single point of failure, meaning that if it goes down you won't be able to submit any additional map/reduce jobs, and the jobs that are already running are killed.

When you restart the JobTracker, you need to resubmit the whole job again; it does not resume the killed tasks from where they left off.
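To illustrate what "resubmit the whole job" means in practice, below is a minimal sketch of a classic MRv1 driver that submits a job to the (restarted) JobTracker. The class name `ResubmitJob`, the `WordCountMapper`/`WordCountReducer` classes, and the input/output paths are all placeholders, not part of the original question; it assumes a configured Hadoop 1.x cluster and the Hadoop jars on the classpath, so it is not runnable standalone.

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class ResubmitJob {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(ResubmitJob.class);
        conf.setJobName("wordcount-resubmitted");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(WordCountMapper.class);    // hypothetical mapper
        conf.setReducerClass(WordCountReducer.class);  // hypothetical reducer
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        // runJob() submits the job to the JobTracker and blocks until it
        // finishes or fails. Any partial output directory left behind by the
        // killed run must be deleted first, because FileOutputFormat refuses
        // to write into an existing output directory.
        JobClient.runJob(conf);
    }
}
```

Note that this is a fresh submission: the JobTracker assigns a new job ID and reruns all map and reduce tasks from scratch, regardless of how far the killed run had progressed.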

hadoop mapreduce
