Can the likelihood function in a generalized linear model be written in terms of the model parameter and the input variable?
In *Generalized Linear Models* by Peter McCullagh and John A. Nelder,

the exponential family distribution has the following form:
$$ f_Y(y; \theta, \phi) = \exp\{ (y\theta - b(\theta))/a(\phi) + c(y, \phi) \} \quad (2.4) $$
Then the mean is
$$ E[Y] = b'(\theta). $$
I was wondering: in general, is $b'$ assumed to be invertible (or at least injective) in generalized linear models, so that we can write $\theta = b'^{-1}(E[Y])$?

The mean is related to the linear predictor via $g(E[Y]) = X\beta$. Is the link function $g$ assumed to be injective or invertible in generalized linear models, so that $E[Y] = g^{-1}(X\beta)$?

The purpose of these two questions is to help me understand how the parameter $\beta$ in generalized linear models is estimated via MLE. In particular, can the pdf of $Y$ be written as $f_Y(y; \beta, X, \phi)$, for example via $\theta = b'^{-1}(g^{-1}(X\beta))$, so that we can apply MLE to $f_Y(y; \beta, X, \phi)$? Thanks!