## Neural Computation

Causal discovery via the asymmetry between cause and effect has proved to be a promising way to infer the causal direction from observations. The basic idea is to assume that the mechanism generating the cause distribution p(x) and the mechanism generating the conditional distribution p(y|x) correspond to two independent natural processes, and thus that p(x) and p(y|x) fulfill some sort of independence condition. In many situations, however, this independence condition does not hold in the anticausal direction: if we regard p(x, y) as generated via p(y)p(x|y), then there are usually contrived mutual adjustments between p(y) and p(x|y). This asymmetry can be exploited to identify the causal direction. Based on this postulate, in this letter we define an uncorrelatedness criterion between p(x) and p(y|x) and, based on this uncorrelatedness, show an asymmetry between cause and effect: a certain complexity metric on p(x) and p(y|x) is smaller than the same complexity metric on p(y) and p(x|y). We propose a Hilbert space embedding-based method, EMD (an abbreviation for EMbeDding), to calculate the complexity metric and show that this method preserves the relative magnitude of the metric. Based on the complexity metric, we propose an efficient kernel-based algorithm for causal discovery. The contribution of this letter is threefold: the method allows a general transformation from cause to effect that incorporates the noise effect; it is applicable to both one-dimensional and high-dimensional data; and it can be used to infer the causal ordering of multiple variables. Extensive experiments on simulated and real-world data demonstrate the effectiveness of the proposed method.