Open Access Article
Entropy 2014, 16(5), 2568-2591;

A Relevancy, Hierarchical and Contextual Maximum Entropy Framework for a Data-Driven 3D Scene Generation

Department of Electrical and Computer Engineering, Texas Tech University, 2500 Broadway, Lubbock, TX 79409, USA
Author to whom correspondence should be addressed.
Received: 16 January 2014 / Revised: 16 April 2014 / Accepted: 4 May 2014 / Published: 9 May 2014
(This article belongs to the Special Issue Maximum Entropy and Its Application)


We introduce a novel Maximum Entropy (MaxEnt) framework that can generate 3D scenes by incorporating objects' relevancy, hierarchical and contextual constraints in a unified model. This model is formulated as a Gibbs distribution, under the MaxEnt framework, that can be sampled to generate plausible scenes. Unlike existing approaches, which represent a given scene by a single And-Or graph, the relevancy constraint (defined as the frequency with which a given object exists in the training data) requires our approach to sample from multiple And-Or graphs, allowing variability in terms of objects' existence across synthesized scenes. Once an And-Or graph is sampled from the ensemble, the hierarchical constraints are employed to sample the Or-nodes (style variations), and the contextual constraints are subsequently used to enforce the corresponding relations that must be satisfied by the And-nodes. To illustrate the proposed methodology, we use desk scenes composed of objects whose existence, styles and arrangements (position and orientation) can vary from one scene to the next. The relevancy, hierarchical and contextual constraints are extracted from a set of training scenes and utilized to generate plausible synthetic scenes that in turn satisfy these constraints. After applying the proposed framework, scenes that are plausible representations of the training examples are automatically generated.
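The first two stages of the pipeline described above, sampling object existence from relevancy constraints and then sampling a style for each Or-node, can be illustrated with a minimal sketch. All object names, frequencies and style lists below are hypothetical placeholders, not values from the paper, and the contextual stage (arranging positions and orientations by sampling the Gibbs distribution) is omitted for brevity:

```python
import random

# Hypothetical relevancy constraints: the frequency with which each object
# appears across training desk scenes (illustrative values only).
RELEVANCY = {"monitor": 0.95, "keyboard": 0.90, "lamp": 0.60, "mug": 0.40}

# Hypothetical Or-node alternatives per object (hierarchical style variations).
STYLES = {
    "monitor": ["flat-panel", "crt"],
    "keyboard": ["full-size", "compact"],
    "lamp": ["desk", "clip-on"],
    "mug": ["plain", "printed"],
}

def sample_scene(rng):
    """Sample one scene: object existence first (relevancy), then styles."""
    scene = {}
    for obj, freq in RELEVANCY.items():
        if rng.random() < freq:                  # Bernoulli(relevancy)
            scene[obj] = rng.choice(STYLES[obj]) # sample the Or-node style
    return scene

rng = random.Random(0)
scenes = [sample_scene(rng) for _ in range(1000)]
monitor_rate = sum("monitor" in s for s in scenes) / len(scenes)
print(monitor_rate)  # empirical rate tracks the 0.95 relevancy value
```

Sampling existence before styles mirrors the paper's ordering: each draw selects one And-Or graph from the ensemble (which objects exist) before resolving its Or-nodes, so object occurrence rates across many synthetic scenes match the training frequencies.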
Keywords: 3D scene generation; maximum entropy; And-Or graphs
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Cite this article (MDPI and ACS style):
Dema, M.; Sari-Sarraf, H. A Relevancy, Hierarchical and Contextual Maximum Entropy Framework for a Data-Driven 3D Scene Generation. Entropy 2014, 16, 2568-2591.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.