In many statistical problems, the data distribution is specified through a generative process whose likelihood function is analytically intractable, yet inference on the associated model parameters remains of primary interest. We develop a likelihood-free inference framework that combines score matching with gradient-based optimization and bootstrap procedures to deliver both parameter estimation and uncertainty quantification. The proposed methodology introduces tailored score-matching estimators for approximating likelihood score functions, and incorporates an architectural regularization scheme that embeds the statistical structure of log-likelihood scores to improve both accuracy and scalability. We provide theoretical guarantees and demonstrate the practical utility of the method through numerical experiments, in which it compares favorably with existing approaches.
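
To make the overall pipeline concrete, the following is a minimal toy sketch and not the paper's tailored estimators or regularization scheme: it assumes a one-dimensional Gaussian location simulator, fits a linear score model to simulator draws via the closed-form minimizer of Hyvärinen's implicit score-matching objective, converts the fitted data-space score into an approximate likelihood score (an identity valid for location families), runs gradient ascent on the resulting approximate log-likelihood, and applies a nonparametric bootstrap to the observed data for uncertainty quantification. All function names (simulate, score_matching_fit, estimate) are illustrative and not taken from the paper.

```python
# Toy sketch of likelihood-free estimation via an approximate likelihood score,
# gradient ascent, and the bootstrap. Not the paper's method: the score model
# here is a simple linear function fitted by score matching.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n, rng):
    # Generative process treated as a black-box simulator; a Gaussian
    # location model keeps the example checkable by hand.
    return theta + rng.standard_normal(n)

def score_matching_fit(sim):
    # Closed-form minimizer of Hyvarinen's implicit score-matching objective
    # E[0.5*s(x)^2 + s'(x)] for a linear score model s(x) = a*x + b:
    # a = -1/var, b = mean/var, fitted to simulator draws.
    m, v = sim.mean(), sim.var()
    return -1.0 / v, m / v

def avg_likelihood_score(theta, x_obs, rng, n_sim=2000):
    # For a location family, d/dtheta log p(x|theta) = -d/dx log p(x|theta),
    # so the fitted data-space score yields an approximate likelihood score,
    # averaged here over the observed sample.
    a, b = score_matching_fit(simulate(theta, n_sim, rng))
    return -(a * x_obs + b).mean()

def estimate(x_obs, rng, theta0=0.0, lr=0.5, steps=100):
    # Gradient ascent on the approximate log-likelihood.
    theta = theta0
    for _ in range(steps):
        theta += lr * avg_likelihood_score(theta, x_obs, rng)
    return theta

x_obs = simulate(1.5, 200, rng)            # observed data, true theta = 1.5
theta_hat = estimate(x_obs, rng)

# Nonparametric bootstrap over the observed data for uncertainty quantification.
boot = np.array([estimate(rng.choice(x_obs, x_obs.size, replace=True), rng)
                 for _ in range(200)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"theta_hat = {theta_hat:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```

In this Gaussian toy model the procedure essentially recovers the sample mean, which makes the sketch easy to verify by hand; richer score models and simulators would replace the linear fit in realistic settings.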