We discuss the relation between ϵ-support vector regression (ϵ-SVR) and ν-support vector regression (ν-SVR). In particular, we focus on properties that differ from those of C-support vector classification (C-SVC) and ν-support vector classification (ν-SVC). We then address two issues that do not arise in classification: the possible range of ϵ and the scaling of target values. We implement a practical decomposition method for ν-SVR, conduct computational experiments, and report several interesting numerical observations specific to regression.
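As an illustration of the two formulations (not the decomposition method implemented in this paper), the following sketch fits ϵ-SVR and ν-SVR on the same toy data using scikit-learn's `SVR` and `NuSVR`; the data and all parameter values are arbitrary choices for demonstration. In the ν formulation, ν upper-bounds the fraction of errors and lower-bounds the fraction of support vectors, so the support-vector count reflects the chosen ν rather than a fixed tube width ϵ.

```python
# Hypothetical comparison of epsilon-SVR and nu-SVR on synthetic data.
# scikit-learn's SVR solves the epsilon-SVR problem; NuSVR solves nu-SVR.
import numpy as np
from sklearn.svm import SVR, NuSVR

rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0.0, 5.0, 40)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(40)  # noisy sine targets

# epsilon-SVR: the tube width epsilon is fixed in advance.
eps_svr = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)

# nu-SVR: nu controls the fraction of support vectors/errors,
# and the tube width epsilon is determined by the optimization.
nu_svr = NuSVR(kernel="rbf", C=1.0, nu=0.5).fit(X, y)

print("epsilon-SVR support vectors:", len(eps_svr.support_))
print("nu-SVR support vectors:", len(nu_svr.support_))
```

Because ν lower-bounds the fraction of support vectors, roughly half of the 40 training points remain support vectors in the `NuSVR` model above, whereas the ϵ-SVR count depends on how many targets fall outside the fixed 0.1-wide tube.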