The data collection period spanned 120 days, covering the project’s critical construction phases. The implementation process was divided into three stages:
Phase One: Research Preparation (April 2025): Communicate research requirements with the project’s general contractor and client (Kashi Municipal Housing and Urban–Rural Development Bureau). Obtain a list of project personnel and their contact information. Develop the research plan and train research personnel. Clarify research ethics (e.g., data anonymization, restricted use for academic research only).
Phase Two: Formal Research (May–June 2025): Online distribution of questionnaires via internal corporate communication groups and the Wenjuanxing platform. Offline visits to the project site to conduct one-on-one interviews with key personnel, including design leads, construction team leaders, and project managers, while questionnaire responses were collected in parallel. Interviews were limited to 30–40 min, fully recorded, and subsequently transcribed into text documents.
Phase Three: Data Screening and Organization (July 2025): Conduct validity screening of collected questionnaires based on the following criteria: ① All core questions must be fully completed; ② Responses must show no discernible patterns (e.g., consistently selecting the same option); ③ Verification that respondents were indeed project participants (confirmed via job titles or project involvement documentation). Ultimately, 112 questionnaires were collected, with 105 deemed valid—a 93.75% valid response rate. Eighteen valid interview transcripts were gathered, covering all core project stakeholders.
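The three screening criteria above amount to a simple per-response filter. The sketch below is purely illustrative: the field names (`core_answers`, `role_verified`) and data layout are assumptions for demonstration, not the study's actual data schema.

```python
# Illustrative sketch of the three validity-screening criteria.
# Field names and the response structure are hypothetical.

def is_valid(resp):
    """Apply the three screening rules to one questionnaire response."""
    answers = resp["core_answers"]  # answers to the core questions
    # Criterion 1: all core questions must be fully completed.
    if any(a is None or a == "" for a in answers):
        return False
    # Criterion 2: no discernible pattern (e.g., the same option throughout).
    if len(set(answers)) == 1:
        return False
    # Criterion 3: respondent verified as an actual project participant.
    return resp["role_verified"]

responses = [
    {"core_answers": [4, 5, 3, 4], "role_verified": True},    # valid
    {"core_answers": [3, 3, 3, 3], "role_verified": True},    # patterned -> invalid
    {"core_answers": [4, None, 2, 5], "role_verified": True},  # incomplete -> invalid
]
valid = [r for r in responses if is_valid(r)]
print(len(valid))  # 1
```

In the study itself this screening was applied to all 112 collected questionnaires, retaining 105 (a 93.75% valid response rate).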
3.3.1. Semi-Structured Interview Outline
Centered on the core theme of “EPC landscape project design–construction interface management”, the interview outline integrates the four dimensions identified earlier—contract management, organizational coordination, technical support, and ecological adaptation—while tailoring questions to reflect respondents’ specific job roles. Structured into 6 modules with 32 questions, it ensures content aligns with the research theme while uncovering latent pain points and practical insights. Details follow:
Module 1: Confirmation of Respondent Background Information (5 questions, introductory)
1. What is your current position? What specific responsibilities did you undertake in this project (or comparable reference project)?
2. How long have you been involved in this project (or a comparable reference project)? Which project phase(s) did you participate in (design phase, construction phase, full process)?
3. How many EPC landscape projects have you participated in previously? Do you have experience with EPC projects in arid regions or ecological restoration?
4. Please describe your specific role and core responsibilities within this project.
5. Please outline key design–construction interface coordination activities you have participated in.
Module 2: Contract Management Dimension Interface Issues Interview (5 questions total, 4 core questions + 1 follow-up)
1. Do you believe the contract for this project clearly delineates responsibilities between the design and construction phases? In which specific areas (e.g., design handover, construction changes, quality acceptance) do responsibilities appear ambiguous?
2. Does the contract explicitly define communication protocols between the design and construction parties, as well as methods for handling breaches? Are these provisions feasible in actual implementation?
3. Based on your experience, can incomplete contract terms lead to design–construction interface conflicts? If so, what specific conflicts might arise (e.g., cost disputes, schedule delays)?
4. Please give examples of how contractual rights and responsibilities were clearly defined, or became ambiguous, during wet area construction or special processes (e.g., saline–alkali soil remediation).
Follow-up: How do you suggest optimizing contract terms to reduce responsibility disputes at the design–construction interface?
Module 3: Organizational Coordination Dimension Interface Issues Interview (6 questions: 5 core + 1 follow-up)
1. Did this project establish a collaborative coordination mechanism among designers, contractors, owners, and supervisors? (e.g., regular meetings, dedicated coordination teams)
2. Did contractors participate in collaborative design reviews during the design phase? If involved, what was the depth of involvement (e.g., providing input only, full participation in schematic design)? What were the reasons for non-participation or insufficient involvement?
3. Is information flow between the design and construction phases smooth? What barriers to information transfer exist (e.g., delayed information, information discrepancies, unclear communication channels)?
4. How would you rate the coordination efficiency between the design lead and construction lead on this project? What are the core factors affecting coordination efficiency (e.g., role authority, professional differences, and communication attitude)?
5. Describe the primary coordination mechanisms between the design and construction teams. When and how did the construction team engage in the design phase? Share an example of a problem successfully resolved through collaboration or one where collaboration was insufficient.
Follow-up: Among similar EPC landscape projects, which organizational coordination model (e.g., integrated management team, dedicated liaison) do you believe most effectively enhances coordination efficiency at the design–construction interface?
Module 4: Technical Support and Ecological Adaptation Dimension Interface Issues Interview (8 questions: 7 core + 1 follow-up)
1. Did the project design adequately consider construction feasibility, economic viability, and the ecological characteristics of Kashi’s arid region (e.g., water scarcity, vegetation adaptability)?
2. Did the design scheme undergo changes during implementation due to excessive technical complexity or insufficient ecological adaptability? If so, what types of changes occurred?
3. Did the contractor promptly communicate on-site construction challenges and special ecological restoration requirements to the design team? How responsive was the design team?
4. Did this project utilize digital technologies (e.g., BIM) to support design–construction interface management? What were the outcomes? If not applied, why?
5. What technical tools (e.g., drawings, models, platforms) were used for information transfer? Evaluate their effectiveness and identify issues (e.g., information errors/omissions, update delays).
6. Considering the unique challenges of ecological restoration in arid regions, what specific technical and ecological difficulties do you believe exist in design–construction interface management?
7. Ecological and regional adaptation: To what extent did the design scheme account for arid zone ecological characteristics (e.g., climate, soil, hydrology)? What difficulties arose during construction due to insufficient ecological data or inadequate consideration of regional conditions in the design?
Follow-up: What technical approaches should be introduced, and which design-to-construction transition processes should be optimized to enhance interface management in arid zone EPC landscape projects?
Module 5: Summary of Interface Management Pain Points and Optimization Recommendations (7 questions, open-ended)
1. Overall, what do you consider to be the three most prominent issues in design–construction interface management for this project? Please provide examples based on your practical experience.
2. What specific optimization recommendations do you have for these prominent issues? (You may address any of the following dimensions: contract, organization, technology, or ecological adaptation.)
3. What unique characteristics do you observe in interface management for similar EPC landscape projects in the Kashi region of Xinjiang (an arid zone) compared to projects in other regions? What should be prioritized?
4. Do you believe the proposed optimization directions for interface management (root layer, transition layer, direct layer) align with actual project needs? What additional suggestions do you have?
5. What do you consider the root causes of the aforementioned interface issues (e.g., institutional, procedural, technical, and cognitive levels)?
6. Based on your experience, what specific recommendations would you offer to enhance interface management effectiveness in such arid region EPC landscape projects?
7. How do you envision digital technologies (e.g., BIM and collaboration platforms) being better applied to address these challenges in the future?
Module 6: Summary and Supplement (1 Question)
1. Do you have any other important insights or experiences regarding interface management in this project that you wish to add?
3.3.2. Interview Data Coding and Analysis Process
As the core foundation of qualitative research, interview data complements quantitative survey data to uncover the underlying causes of interface management issues, validate quantitative findings, and provide practical evidence for developing optimization strategies. The coding and analysis of this interview data strictly followed the research logic of grounded theory, utilizing NVivo 12 software to assist. The process was divided into three stages: “open coding → axial coding → selective coding”, combined with manual verification to ensure objectivity, accuracy, and logical consistency. The specific process is as follows:
Coding preparation involved three steps. First, the interview transcripts were standardized: all 18 valid interview recordings were transcribed into Word documents, with irrelevant content (e.g., introductory pleasantries, repetitive statements, slips of the tongue) removed. Text formatting was unified, and each transcript was annotated with the interviewee ID (e.g., F1-Design Lead, S1-Construction Crew Leader, G1-Project Manager), position, and project type to ensure readability and consistency. Second, a coding team (2 researchers + 1 industry expert) was formed with clear coding principles: ① Objectivity: coding strictly followed the interview content without personal judgment; ② Relevance: codes had to relate to the core topic of “EPC landscape project design–construction interface management”; ③ Consistency: the team pre-established coding standards and keyword definitions to prevent coding discrepancies. Finally, a coding manual outlining the coding process and symbol conventions was developed to guide subsequent coding work.
Phase One involves open coding (initial coding), whose core purpose is to “break down textual integrity and extract raw concepts”. This entails analyzing standardized interview transcripts sentence by sentence and paragraph by paragraph to identify statements relevant to the research theme, distill initial concepts, and categorize similar initial concepts into preliminary categories.
Specific procedure: The coding team independently coded all 18 interview transcripts, screening sentence by sentence for valid statements. For example, when Interviewee F1 stated, “The contract fails to clearly define the responsible party for design handover, leading to mutual blame between designers and contractors and delaying construction progress”, the core information was extracted to form the initial concept: “ambiguous contractual responsibility allocation—unclear design handover accountability”. Interviewee S1 stated, “The design team failed to consider Kashi’s water scarcity before construction, resulting in an impractical landscaping plan that required repeated design revisions”. This led to the initial concept: “insufficient ecological adaptability of design plans—lack of consideration for water resources in arid regions”.
During coding, ambiguous or unclear statements were interpreted based on interview context and the interviewee’s role, with verification through coding team discussions when necessary. Repeated original concepts (e.g., “delayed information transfer”, “contractor not involved in preliminary design”) were assigned uniform codes to prevent duplicate categorization. Through open coding, 426 original statements were extracted, yielding 158 initial concepts. By merging similar concepts and eliminating irrelevant ones (e.g., construction technical details unrelated to interface management), 52 initial categories were ultimately formed, encompassing contract management (8), organizational coordination (16), technical support (14), ecological adaptation (10), and supplementary matters (4). This established the foundation for subsequent axial coding.
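The merging step above, in which repeated concepts receive one uniform code and the merged concepts are grouped by research dimension, can be sketched as follows. The concept labels and dimension names are illustrative stand-ins, not the study's actual code book.

```python
from collections import Counter, defaultdict

# Hypothetical coded statements: (initial concept, research dimension) pairs.
# Labels are illustrative, not the study's actual codes.
coded = [
    ("delayed information transfer", "organizational coordination"),
    ("delayed information transfer", "organizational coordination"),
    ("ambiguous design handover accountability", "contract management"),
    ("no consideration of arid-region water scarcity", "ecological adaptation"),
]

# Merge repeats: each distinct concept keeps a single code plus a frequency,
# so repeated statements are not categorized twice.
concept_freq = Counter(label for label, _ in coded)

# Group the merged (distinct) concepts by dimension, mirroring the
# 8/16/14/10/4 split reported in the text.
by_dimension = defaultdict(set)
for label, dim in coded:
    by_dimension[dim].add(label)

print(concept_freq["delayed information transfer"])      # 2
print(len(by_dimension["organizational coordination"]))  # 1
```

Keeping a frequency alongside each merged code also preserves how often a concept recurred across transcripts, which is useful evidence when deciding which initial categories to retain.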
The second phase involves axial coding (associative coding), whose primary objective is to “organize the intrinsic connections among initial categories and establish primary and secondary categories”. This entails analyzing the causal, subordinate, and associative relationships among the 52 initial categories to integrate dispersed initial categories, assigning them to corresponding primary categories while clarifying the hierarchical relationships between primary categories and secondary categories (initial categories promoted to a higher level of abstraction) to form a systematic category framework.
Specific implementation: The coding team analyzed each of the 52 initial categories individually, integrating the core research dimensions (contract management, organizational coordination, technical support, and ecological adaptation) to map inter-category relationships. For example, eight initial categories—including “unclear contractual responsibility allocation—ambiguous design briefing responsibilities”, “contracts lacking clear breach handling procedures”, and “inadequate contract communication mechanisms”—were consolidated into the primary category “inadequate contract management”, with the eight initial categories serving as its subcategories. Similarly, sixteen initial categories—such as “contractor non-participation in preliminary design”, “delayed information transmission”, and “lack of collaborative coordination mechanisms”—were grouped into the primary category “low organizational coordination efficiency”. This was further subdivided into three secondary subcategories: “insufficient preliminary collaboration”, “information transmission barriers”, and “inadequate coordination mechanisms”, with the original sixteen initial categories serving as tertiary subcategories.
During this phase, the coding team resolved ambiguities in category relationships and attribution discrepancies through multiple discussions and cross-validation. For instance, “BIM technology not applied” was assigned as a subcategory under both the primary category “insufficient technical support” and the primary category “low organizational coordination efficiency”, thereby clarifying its dual attributes. Ultimately, a coding system comprising 4 primary categories (aligned with the four major influencing factor dimensions mentioned earlier), 12 secondary subcategories, and 52 tertiary subcategories was established, enabling systematic classification of interview data.
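The resulting three-level coding system, including a tertiary code with dual attributes that sits under two primary categories, can be represented as a nested mapping. The category names below are abbreviated illustrations of those reported in the text, not a reproduction of the full 4/12/52 system.

```python
# Minimal sketch of the three-level coding system (primary category ->
# secondary subcategory -> tertiary codes). Names are illustrative.
coding_system = {
    "inadequate contract management": {
        "incomplete contract terms": [
            "ambiguous design handover accountability",
            "no breach-handling procedures",
        ],
    },
    "low organizational coordination efficiency": {
        "information transmission barriers": [
            "delayed information transfer",
            "BIM technology not applied",
        ],
    },
    "insufficient technical support": {
        "weak digital support": [
            "BIM technology not applied",
        ],
    },
}

def primaries_of(code):
    """Return every primary category containing a given tertiary code,
    making dual-attribute codes (listed under two primaries) explicit."""
    return sorted(
        primary
        for primary, subcats in coding_system.items()
        for tertiary in subcats.values()
        if code in tertiary
    )

print(primaries_of("BIM technology not applied"))
```

A lookup like `primaries_of` makes dual attributions auditable: any code returning more than one primary category is one the team deliberately assigned to multiple dimensions, rather than a duplication error.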
The third phase involved selective coding (core coding), whose primary purpose was to “extract core categories and construct a theoretical framework”. This entailed identifying, from among the four primary categories, the core category (or categories) capable of encompassing all primary categories and subcategories and running throughout the interview texts, and clarifying the logical relationships between core categories, primary categories, and subcategories to form a comprehensive qualitative analysis framework. Concurrently, quantitative data from the questionnaire survey were used to validate the rationality of the category system.