We consider the design of optimal quantizers for the distributed estimation of a deterministic parameter. In particular, we design deterministic scalar quantizers to maximize the minimum asymptotic relative efficiency (ARE) between the quantized and unquantized maximum-likelihood (ML) estimators. We first design identical quantizers from the class of score-function quantizers (SFQs). We show that the structure of an SFQ generally depends on the parameter value, but that for a large class of distributions it can be expressed as a set of thresholds on the sufficient statistic. For distributions of that class, we provide a convergent iterative algorithm to obtain the SFQ that maximizes the minimum ARE. We compare the performance of the optimal SFQ with that of a general quantizer designed without any restrictions on its structure. This general quantizer is hard to implement due to its lack of structure, but it is optimal provided the iterative design algorithm does not encounter local minima. Through numerical simulations, we illustrate that the two designed quantizers are identical; in other words, the optimal quantizer structure is that of an SFQ. For a distributed estimation setup, designing identical quantizers is shown to be suboptimal. We therefore propose a joint multiple-quantizer design algorithm based on a person-by-person optimization technique employing the SFQ structure. Using numerical examples, we illustrate the gain in performance obtained by designing nonidentical quantizers.
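As an illustrative sketch of the maximin-ARE criterion described above (not the paper's algorithm), consider a 1-bit threshold quantizer for the Gaussian location model X ~ N(θ, 1) with θ in a known interval. The unquantized Fisher information is 1, so the ARE of the quantized ML estimator equals the Fisher information of the quantized sample, and a brute-force search can pick the threshold maximizing the worst-case ARE. All names and the grid-search approach here are assumptions for illustration only.

```python
# Illustrative sketch: maximin-ARE design of a 1-bit threshold quantizer
# for X ~ N(theta, 1), with theta assumed to lie in [-1, 1].
import math

def phi(x):
    # standard normal pdf
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    # standard normal cdf via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def are_1bit(t, theta):
    """ARE of the quantized vs. unquantized ML estimator at threshold t.

    For the Gaussian location model the unquantized Fisher information is 1,
    so the ARE equals the Fisher information of the 1-bit observation:
        I_q(theta) = phi(t - theta)^2 / (Phi(t - theta) * (1 - Phi(t - theta)))
    """
    z = t - theta
    p = Phi(z)
    return phi(z) ** 2 / (p * (1 - p))

def maximin_threshold(theta_grid, t_grid):
    # choose the threshold maximizing the minimum ARE over the parameter grid
    return max(t_grid, key=lambda t: min(are_1bit(t, th) for th in theta_grid))

thetas = [i / 10 for i in range(-10, 11)]   # theta grid over [-1, 1]
ts = [i / 100 for i in range(-300, 301)]    # candidate thresholds over [-3, 3]
t_star = maximin_threshold(thetas, ts)      # symmetric interval -> t_star = 0
```

For a symmetric parameter interval the worst-case ARE is maximized by the threshold at the interval's center, which the grid search recovers; the paper's SFQ design and person-by-person extension replace this brute-force search with convergent iterative updates over multiple quantizers.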