Re: the OP's question, the assumption of the log5 normalization method is that the "era" is a weighted average of the hitter's season and the pitcher's season.
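For anyone unfamiliar with it, the standard log5 (odds-ratio) matchup formula can be sketched as below. This is the textbook Bill James version, not necessarily the site's exact implementation; the variable names and sample rates are mine.

```python
# log5 / odds-ratio matchup formula: combines a hitter's rate, a pitcher's
# rate, and the league rate into an expected rate for the matchup.
def log5(batter_rate: float, pitcher_rate: float, league_rate: float) -> float:
    """Expected event rate when batter_rate faces pitcher_rate,
    normalized against league_rate."""
    num = (batter_rate * pitcher_rate) / league_rate
    den = num + ((1 - batter_rate) * (1 - pitcher_rate)) / (1 - league_rate)
    return num / den

# Sanity checks: a league-average matchup stays league-average, and a
# .300 hitter facing a league-average pitcher stays a .300 hitter.
print(log5(0.260, 0.260, 0.260))  # 0.26
print(log5(0.300, 0.260, 0.260))  # 0.30
```

The nice property is visible in the second line: a league-average pitcher leaves the hitter's rate unchanged, which is exactly the "weighted average of the two seasons" behavior being discussed.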
Re: Marshall, he is one of the textbook examples of the problem with the site's pitcher fatigue algorithm. Marshall achieved his 200+ innings by averaging 2 IP per appearance, but on 20 occasions (by my count) he pitched 3 innings or more, including 6 innings on 8/19 and 5 innings on 9/14. In SLB, his fatigue will kick in hard after 2 innings (or more precisely, after about 30 pitches, as described above), so it is impossible to get him his actual innings unless you are willing to let him get shelled on occasion during individual games, or to pitch him when he is already fatigued from previous work.
I don't mind quite as much when modern relievers get hit hard after exceeding an inning or so. It's still not ideal, nor terribly realistic, but it does at least force SLB owners to use modern RPs more or less the way they are actually used today: low PC, typically one inning max, pulled at the first sign of trouble or to gain the platoon advantage.
But for Marshall, or more generally the high-IP relievers of the 60s-70s, or guys who split their season between starting and relieving (e.g., '92 Schilling), or the SPs from earlier decades who frequently relieved in addition to their starting duties (e.g., Mordecai Brown in 1909-1911, Lefty Grove in 1930-31), basing fatigue on AVERAGE IP/G leads to unrealistic results.
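To make the averaging problem concrete, here's a toy sketch (not the site's actual code, and the outing lengths are made up, not Marshall's real game log). If fatigue effectively caps every outing near the pitcher's average length, the long outings that produced the real-life innings total become unreachable:

```python
# Hypothetical outing lengths (innings) for a high-variance reliever.
# The pattern mimics Marshall-style usage: mostly ~2 IP, occasional long stints.
outings = [1, 1, 2, 2, 2, 2, 2, 3, 4, 6]

actual_ip = sum(outings)                      # innings he really threw
fatigue_cap = 2                               # fatigue kicks in hard past ~2 IP
capped_ip = sum(min(ip, fatigue_cap) for ip in outings)

print(actual_ip)   # 25
print(capped_ip)   # 18 -- the cap eats 7 innings across just 10 appearances
```

Over a 100+ appearance season, that shortfall compounds: the only ways to recover the missing innings are exactly the two bad options above, letting him pitch through heavy in-game fatigue or bringing him in already tired.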