I suggest reading the Spark cluster docs first, and even more so this Cloudera blog post, which explains these modes.
The answer to your first question depends on what you mean by 'instances'. A node is a machine, and there's not a good reason to run more than one worker per machine. So "two worker nodes" typically means two machines, each running one Spark worker.
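If you want to control what each machine's worker offers (or, in unusual cases, run more than one worker per machine), standalone mode reads these settings from `conf/spark-env.sh` on each worker machine. A minimal sketch; the values here are illustrative, not recommendations:

```sh
# conf/spark-env.sh on each worker machine (standalone mode)
# Values are illustrative, not recommendations.

SPARK_WORKER_INSTANCES=1   # workers per machine; 1 is the default and usually right
SPARK_WORKER_CORES=8       # total cores this worker offers to applications
SPARK_WORKER_MEMORY=16g    # total memory this worker offers to applications
```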
A worker holds many executors, for many applications; conversely, one application has executors on many workers.
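You can see the same relationship from the application side in how resources are requested at submit time: the application asks for per-executor memory and a total core budget, and the master spreads its executors across the workers. A sketch for standalone mode; the master URL, class name, jar, and resource numbers are all placeholders:

```sh
# Submit one application to a standalone cluster; the master places its
# executors across the available workers, within the limits below.
spark-submit \
  --master spark://master-host:7077 \
  --executor-memory 2g \
  --total-executor-cores 8 \
  --class com.example.MyApp \
  my-app.jar
```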
Your third question is not clear.