A new self-weighted least squares (LS) estimation theory is developed for local unit root (LUR) autoregression with heteroskedasticity. The proposed estimator has a mixed Gaussian limit distribution, and the corresponding studentized statistic converges to a standard normal distribution free of the unknown localizing coefficient, which is not consistently estimable. The estimator is super-consistent, with a convergence rate slightly below the O_p(n) rate of LS estimation. The asymptotic theory relies on a new framework of convergence to the local time of a Gaussian process, allowing for sample moments generated from martingales and many other integrated dependent sequences. A new unit root (UR) test in augmented autoregression is developed using self-weighted estimation, and the methods are employed in predictive regression, providing an alternative approach to IVX regression. Simulation results showing good finite sample performance of these methods are reported, together with a small empirical application.
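To fix ideas, the contrast between ordinary LS and a self-weighted LS estimator in an LUR autoregression can be sketched in a short simulation. This is a minimal illustration only: the localizing coefficient c, the heteroskedastic error scale, and the weight function w_t = 1/(1 + |y_{t-1}|) are all illustrative assumptions, not the paper's specific choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Local-unit-root AR(1): y_t = rho_n * y_{t-1} + u_t with rho_n = 1 + c/n.
n, c = 500, -5.0
rho = 1.0 + c / n

# Heteroskedastic innovations: a deterministic time-varying scale (illustrative).
scale = 1.0 + 0.5 * np.abs(np.sin(np.arange(n)))
u = rng.standard_normal(n) * scale

y = np.zeros(n)
for t in range(1, n):
    y[t] = rho * y[t - 1] + u[t]

x = y[:-1]  # regressor y_{t-1}
z = y[1:]   # response y_t

# Ordinary LS estimate of rho.
rho_ls = (x @ z) / (x @ x)

# Self-weighted LS: weights depend on the lagged regressor itself,
# damping large observations (one common illustrative weight choice).
w = 1.0 / (1.0 + np.abs(x))
rho_sw = ((w * x) @ z) / ((w * x) @ x)

print("true rho:", rho, "LS:", rho_ls, "self-weighted LS:", rho_sw)
```

Both estimates lie near the true local-to-unity value rho_n = 1 + c/n; the self-weighted version trades a slightly slower rate for the mixed Gaussian limit that makes standard normal inference possible.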