[1]. Tamara Broderick, Michael I. Jordan, and Jim Pitman. Cluster and Feature Modeling from Combinatorial Stochastic Processes. Statistical Science. 2013; 28(3): 289–312.
[2]. Changyou Chen, Vinayak Rao, Wray Buntine, and Yee Whye Teh. Dependent normalized random measures. Proceedings of the 30th International Conference on Machine Learning. 2013; 28(1): 969–977.
[3]. Stefano Favaro, Antonio Lijoi, and Igor Prünster. On the stick-breaking representation of normalized inverse Gaussian priors. Biometrika. 2012; 99(3): 663–674.
[4]. Stefano Favaro and Yee Whye Teh. MCMC for Normalized Random Measure Mixture Models. Statistical Science. 2013; 28(3): 335–359.
[5]. J. E. Griffin, M. Kolossiatis, and M. F. J. Steel. Comparing distributions by using dependent normalized random-measure mixtures. Journal of the Royal Statistical Society: Series B (Statistical Methodology). 2013; 75(3): 499–529.
[6]. Fabrizio Leisen, Antonio Lijoi, and Dario Spanò. A vector of Dirichlet processes. Electronic Journal of Statistics. 2013; 7(1): 62–90.
[7]. Antonio Lijoi and Bernardo Nipoti. Dependent mixture models: Clustering and borrowing information. Computational Statistics & Data Analysis. 2014; 71(1): 417–433.
[8]. Liming Wang and Xiaodong Wang. Hierarchical Dirichlet process model for gene expression clustering. EURASIP Journal on Bioinformatics and Systems Biology. 2013; 2013(1): 5–10.
[9]. Yi-an Ma, Tianqi Chen, and Emily B. Fox. A Complete Recipe for Stochastic Gradient MCMC. Advances in Neural Information Processing Systems (NIPS). 2015; 2015(1): 1–19.
[10]. Mingyuan Zhou and Lawrence Carin. Negative binomial process count and mixture modeling. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2015; 37(2): 307–320.