As in the book, a Poisson process is an arrival process -- omitting the math, some of which is amazing, the vanilla example is Geiger counter clicks. One of the special things, actually an easy calculus exercise, is this: Suppose the clicks come on average once a second. Suppose we have not gotten a click for time t >= 0, say t is 2 seconds. Now what is the average time to the next click -- less than 1 second? Nope, the average now (a statement in conditional probability) is STILL 1 second. That is the memoryless property.
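Here is a little numerical check of that (mine, not from the book), in Python with NumPy; the mean of 1 second and the wait of 2 seconds are just the numbers from the example above:

    # Check the memoryless property numerically: given we have already
    # waited 2 seconds with no click, the mean remaining wait is still 1.
    import numpy as np

    rng = np.random.default_rng(0)
    waits = rng.exponential(scale=1.0, size=1_000_000)  # mean 1 second between clicks

    t = 2.0                                  # already waited 2 seconds, no click yet
    remaining = waits[waits > t] - t         # time still left, given no click so far

    print(np.mean(waits))      # ~1.0  (unconditional mean)
    print(np.mean(remaining))  # ~1.0  (conditional mean -- still 1 second)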
Next, a Markov chain, that is, a discrete time, discrete state space Markov process. Halt: In a Markov process, at any time t, the past and future are conditionally independent given the present. E.g., for predicting the future, if we know the present, then also knowing the past does not help (or hurt, really). So, Markov is an enormous simplification. In particular, a Poisson process is a Markov process.
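A tiny sketch of what that simplification looks like in code (the two-state transition matrix is invented purely for illustration): the next state is drawn using only the current state, never the history.

    import numpy as np

    P = np.array([[0.9, 0.1],   # row i = distribution of the next state given state i
                  [0.4, 0.6]])

    rng = np.random.default_rng(1)
    state, path = 0, [0]
    for _ in range(10):
        state = rng.choice(2, p=P[state])   # depends on `state` alone, not on `path`
        path.append(state)
    print(path)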
For a discrete time, discrete state space Markov process, i.e., a Markov chain, subordinated to a Poisson process, the jumps occur at the arrivals of a Poisson process, so the time to the next change (jump) is exponentially distributed. A discrete time, discrete state space Markov process does not have to have anything to do with a Poisson process and, instead, can have, say, one jump an hour on the hour.
But when it is subordinated to a Poisson process, the expected time to the next jump (arrival), i.e., the parameter of the process, can depend on the state.
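A hedged sketch of one run of such a process: the holding time in the current state is exponential, and its rate is allowed to depend on the state. The rate function and the jump rule below are stand-ins I made up, not anything from the book.

    import numpy as np

    rng = np.random.default_rng(2)

    def rate_of(state):
        return 1.0 + state                    # illustrative: higher states jump faster

    def next_state(state):
        return state + rng.choice([-1, 1])    # illustrative jump rule

    t, state = 0.0, 5
    for _ in range(5):
        hold = rng.exponential(scale=1.0 / rate_of(state))  # mean holding time = 1/rate
        t += hold
        state = next_state(state)
        print(f"t={t:.3f}  state={state}")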
E.g., in the war at sea problem, as the ships sink, the rate at which they sink can slow down.
There was a WWII calculation by Koopmans that gave the encounter rate of multiple ships at sea. The formula used the area of the sea, the number of ships, etc. So, that is where I got the encounter rates.
Then the main inputs were the number of Red ships, the number of Blue ships, the speed of each ship, and for any Red-Blue encounter the probabilities that the Red dies, the Blue dies, both die, or neither dies.
That was enough data to generate sample paths (war simulations) of the assumed scenario. Silly, sure, but okay for one grad student in a rush in just two weeks!
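For flavor, here is a rough sketch of generating one such sample path. The encounter rate here, proportional to red * blue * speed / area, and all the numbers are guesses for illustration only -- not the actual WWII formula and not my old parameters. Note that as ships sink, red * blue shrinks, so the encounter rate slows down, as mentioned above.

    import numpy as np

    rng = np.random.default_rng(3)

    area, speed = 1.0e5, 30.0          # sea area, common ship speed (made-up units)
    red, blue, t = 20, 15, 0.0
    # per-encounter outcome probabilities: Red sinks, Blue sinks, both, neither
    p_outcomes = [0.3, 0.3, 0.1, 0.3]

    while red > 0 and blue > 0:
        rate = red * blue * speed / area          # assumed encounter rate
        t += rng.exponential(scale=1.0 / rate)    # time to the next encounter
        outcome = rng.choice(4, p=p_outcomes)
        if outcome in (0, 2):
            red -= 1
        if outcome in (1, 2):
            blue -= 1
        print(f"t={t:.1f}  red={red}  blue={blue}")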
The above may be enough for you to read and understand the details in the book quickly. But for more, the book also covers limiting distributions (as time evolves, the probability distribution over the states approaches a limit), the Kolmogorov equations, branching processes, etc. An example of a branching process is some of the simple cases of nuclear fission.
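The limiting distribution idea is easy to see numerically: push any starting distribution through a transition matrix over and over and watch it settle. The matrix below is the same invented two-state one from earlier.

    import numpy as np

    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])

    dist = np.array([1.0, 0.0])      # start surely in state 0
    for _ in range(100):
        dist = dist @ P              # one step of the chain
    print(dist)                      # approaches the limit (0.8, 0.2)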
Now you have saved quite a lot of time and gotten ready for a fairly easy time in the book!