30 seconds of code

Curated collection of useful JavaScript snippets that you can understand in 30 seconds or less.

Read more
Project 4 of Statistics.md

Homework 6

1.

Prove that $\hat \sigma^2 = \dfrac{SS_E}{(a-1)(b-1)}$ is an unbiased estimator of $\sigma^2$.
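A sketch of the argument, assuming the two-factor fixed-effects model without replication, $y_{ij} = \mu + \tau_i + \beta_j + \varepsilon_{ij}$ with $\varepsilon_{ij} \sim N(0,\sigma^2)$ i.i.d., $i = 1,\dots,a$, $j = 1,\dots,b$:

$$
SS_E = \sum_{i=1}^{a}\sum_{j=1}^{b}\left(y_{ij} - \bar y_{i\cdot} - \bar y_{\cdot j} + \bar y_{\cdot\cdot}\right)^2,
\qquad
E[SS_E] = (a-1)(b-1)\,\sigma^2,
$$

so that $E[\hat\sigma^2] = E[SS_E]/\bigl((a-1)(b-1)\bigr) = \sigma^2$. The middle identity follows by expanding each residual in terms of the $\varepsilon_{ij}$ (the fixed effects cancel) and counting the degrees of freedom of the residual space.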

Read more
Project 3 of Statistics.md

Homework 5

13.13

The gamma probability density function is

$$
f(y,r,\lambda) = \frac{\lambda^r}{\Gamma(r)} e^{-\lambda y} y^{r-1}, \quad\text{for } y,\lambda\geq 0
$$

Show that the gamma is a member of the exponential family.
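One route, assuming the exponential family written in the canonical form $f(y;\theta) = h(y)\exp\{\eta(\theta)\cdot T(y) - A(\theta)\}$, is to take logarithms and regroup:

$$
f(y; r, \lambda) = \exp\bigl\{ (r-1)\ln y - \lambda y + r\ln\lambda - \ln\Gamma(r) \bigr\},
$$

which matches the canonical form with natural parameter $\eta = (r-1,\, -\lambda)$, sufficient statistic $T(y) = (\ln y,\, y)$, and log-partition $A = \ln\Gamma(r) - r\ln\lambda$; with $r$ held fixed it reduces to a one-parameter exponential family in $\lambda$.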

Read more
Linear Regression in Julia with the GLM Package

This post uses the GLM package to perform a simple linear regression analysis, together with model diagnostics and confidence-interval plots.

using GLM, DataFrames
Read more
Project 2 of Statistics.md

Abstract

This project presents, chapter by chapter, ridge estimation, principal component estimation, stepwise regression, and regression with indicator (dummy) variables.

We mainly use the dplyr and readxl packages to import the data and obtain a data-frame view; the MASS and car packages to draw ridge-trace plots and perform principal component estimation and stepwise regression; the dummies package to convert indicator variables into the appropriate design matrix; and the bootstrap package to carry out $k$-fold cross-validation.
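The core of the workflow above — fitting a ridge regression and choosing its penalty by $k$-fold cross-validation — can be sketched as follows; scikit-learn's `Ridge` and `KFold` are stand-ins for the R packages named above, and the data here is simulated for illustration:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

# simulated design matrix and response (stand-in for the imported data)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta = np.array([1.0, 0.5, 0.0, -0.5, 2.0])
y = X @ beta + rng.normal(scale=0.5, size=100)

def cv_mse(alpha, k=5):
    """Mean squared prediction error of Ridge(alpha) under k-fold CV."""
    errors = []
    for train, test in KFold(n_splits=k, shuffle=True, random_state=0).split(X):
        model = Ridge(alpha=alpha).fit(X[train], y[train])
        errors.append(np.mean((model.predict(X[test]) - y[test]) ** 2))
    return float(np.mean(errors))

# pick the ridge penalty with the smallest cross-validated error
alphas = [0.01, 0.1, 1.0, 10.0]
best = min(alphas, key=cv_mse)
```

The ridge-trace plot mentioned in the abstract corresponds to evaluating the fitted coefficients over this same grid of penalties and plotting them against `alpha`.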

Read more
Assignment 2 of Advanced Computational Method

Homework 3

The Metropolis–Hastings algorithm works by generating a sequence of sample values in such a way that, as more and more sample values are produced, the distribution of values more closely approximates the desired distribution $P(x)$. These sample values are produced iteratively, with the distribution of the next sample depending only on the current sample value (thus making the sequence of samples a Markov chain). Specifically, at each iteration the algorithm picks a candidate for the next sample value based on the current sample value. Then, with some probability, the candidate is either accepted (in which case the candidate value is used in the next iteration) or rejected (in which case the candidate value is discarded and the current value is reused in the next iteration). The probability of acceptance is determined by comparing the values of a function $f(x)$, proportional to the density of $P(x)$, at the current and candidate sample values.
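The steps above can be sketched as a minimal random-walk sampler, assuming a symmetric Gaussian proposal (so the acceptance probability reduces to $\min\{1, f(x')/f(x)\}$) and, purely for illustration, a target $f$ proportional to a standard normal:

```python
import math
import random

def metropolis_hastings(f, x0, n_samples, step=1.0, seed=0):
    """Sample from a density proportional to f using a random-walk
    Metropolis-Hastings chain with a symmetric Gaussian proposal."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        candidate = x + rng.gauss(0.0, step)  # propose from N(x, step^2)
        # symmetric proposal: accept with probability min(1, f(x')/f(x))
        if rng.random() < min(1.0, f(candidate) / f(x)):
            x = candidate  # accept: the candidate becomes the next state
        # otherwise reject: the current value is reused
        samples.append(x)
    return samples

# target density proportional to a standard normal (normalizer not needed)
target = lambda x: math.exp(-0.5 * x * x)
chain = metropolis_hastings(target, x0=0.0, n_samples=20000)
```

Note that only the ratio $f(x')/f(x)$ enters the acceptance step, which is why the target only needs to be known up to a normalizing constant.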

Read more