Open Access Article

A Relevancy, Hierarchical and Contextual Maximum Entropy Framework for a Data-Driven 3D Scene Generation

Department of Electrical and Computer Engineering, Texas Tech University, 2500 Broadway, Lubbock, TX 79409, USA
Author to whom correspondence should be addressed.
Entropy 2014, 16(5), 2568-2591
Received: 16 January 2014 / Revised: 16 April 2014 / Accepted: 4 May 2014 / Published: 9 May 2014
(This article belongs to the Special Issue Maximum Entropy and Its Application)
We introduce a novel Maximum Entropy (MaxEnt) framework that can generate 3D scenes by incorporating objects’ relevancy, hierarchical and contextual constraints in a unified model. This model is formulated as a Gibbs distribution, under the MaxEnt framework, that can be sampled to generate plausible scenes. Unlike existing approaches, which represent a given scene by a single And-Or graph, the relevancy constraint (defined as the frequency with which a given object exists in the training data) requires our approach to sample from multiple And-Or graphs, allowing variability in terms of objects’ existence across synthesized scenes. Once an And-Or graph is sampled from the ensemble, the hierarchical constraints are employed to sample the Or-nodes (style variations), and the contextual constraints are subsequently used to enforce the corresponding relations that must be satisfied by the And-nodes. To illustrate the proposed methodology, we use desk scenes composed of objects whose existence, styles and arrangements (position and orientation) can vary from one scene to the next. The relevancy, hierarchical and contextual constraints are extracted from a set of training scenes and utilized to generate plausible synthetic scenes that in turn satisfy these constraints. After applying the proposed framework, scenes that are plausible representations of the training examples are automatically generated.
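As a minimal sketch of the relevancy constraint described above: if the MaxEnt model is constrained only to match each object’s marginal frequency in the training data, the resulting Gibbs distribution factorizes into independent Bernoulli terms over object existence, so scene-level existence can be sampled per object. All object names and frequency values below are illustrative assumptions, not data from the paper.

```python
import random

# Hypothetical relevancy constraints: the frequency with which each object
# appears in the training desk scenes (names and values are illustrative).
relevancy = {"monitor": 0.95, "keyboard": 0.90, "lamp": 0.40, "mug": 0.25}

def sample_existence(relevancy, rng):
    """Sample which objects exist in one synthesized scene.

    Matching only marginal frequencies makes the MaxEnt Gibbs
    distribution a product of independent Bernoulli factors, so each
    object's existence is an independent coin flip with its own bias.
    """
    return {obj: rng.random() < p for obj, p in relevancy.items()}

rng = random.Random(0)
scenes = [sample_existence(relevancy, rng) for _ in range(10000)]

# Empirical existence frequencies should approach the relevancy constraints.
freq = {obj: sum(s[obj] for s in scenes) / len(scenes) for obj in relevancy}
```

In the full framework, hierarchical (Or-node) and contextual (And-node) constraints would then condition the styles and arrangements of the objects sampled here; this sketch covers only the first, existence-sampling step.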
Keywords: 3D scene generation; maximum entropy; And-Or graphs
MDPI and ACS Style

Dema, M.; Sari-Sarraf, H. A Relevancy, Hierarchical and Contextual Maximum Entropy Framework for a Data-Driven 3D Scene Generation. Entropy 2014, 16, 2568-2591.

