A Commercial Photographer’s Take on When to Use AI (and When Not To)

The Catalyst: From Traditional Shoots to Digital Synthesis

The transition toward an AI-integrated workflow often begins with a challenge to traditional logistics. For Zhan, a pivotal moment occurred when a client requested a full lookbook for a new clothing line but explicitly stated they did not want a traditional photo shoot. Instead, the client provided mobile phone photos and requested that they be transformed into professional-grade assets using AI. While such a request might have been viewed as a devaluation of the craft by traditionalists, Zhan’s background in 3D and CGI production allowed him to view the proposal through a technical lens of layers and constructed environments.

This shift mirrors a broader trend in the creative economy. According to data from the 2023 Adobe "State of Creativity" report, nearly 70% of creative professionals have started using AI tools to streamline their workflows, with the most significant gains reported in the conceptualization and post-production phases. Zhan’s adoption of these tools began two years ago, well before generative AI became a mainstream topic of industry debate. By treating AI as a "new way to build" realities rather than merely a replacement for the camera, he has been able to maintain the aesthetic standards required by publications like Harper’s Bazaar and ELLE while significantly reducing the overhead of physical production.

Pre-Production: The End of the Generic Mood Board

In the traditional commercial photography timeline, the pre-production phase is often the most fraught with potential for error. Historically, photographers relied on mood boards—collages of existing imagery pulled from Pinterest or editorial archives—to communicate a vision to a client. The inherent limitation of this method is that it relies on someone else’s work to describe a new concept. Misunderstandings at this stage often lead to the most expensive mistake in the industry: a misalignment between the client’s imagination and the final delivery, discovered only after the budget has been spent on a physical shoot.

Zhan has replaced this speculative process with AI-driven concept generation. By using tools like Midjourney or Stable Diffusion, he can generate custom visual interpretations of a brief within hours. If a brand requests a theme that is "urban but soft, editorial but approachable," Zhan no longer searches for approximate references. He generates a dozen distinct visual directions specific to the brand’s identity. This allows the client to choose between versions of their own vision rather than selecting from a menu of other brands’ past campaigns. This "pre-visualization" phase ensures that by the time the crew walks onto a set, the creative direction is no longer an exploration but a precise execution.

Hybrid Production: Studio Control and Synthetic Environments

The most significant impact of AI on Zhan’s work is seen in the "hybrid shoot," where the subject is captured via traditional photography and the environment is generated digitally. This approach was utilized in a high-profile editorial for Harper’s Bazaar, where the brief called for surreal landscapes, including floating rock formations and misty, otherworldly atmospheres.

The Logistics of the Hybrid Model:

  1. Controlled Capture: The talent is photographed in a studio against a neutral background. This allows for absolute control over lighting direction, which must be mathematically consistent with the intended AI-generated background.
  2. Environmental Synthesis: AI is used to generate backgrounds that would be physically impossible or prohibitively expensive to build as sets or visit as locations.
  3. Physical Integration: Critical details such as ground shadows, eyelines, and physical contact points (e.g., a model’s feet on the ground) are meticulously managed to ensure the composite feels grounded in reality.

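The lighting-consistency requirement in step 1 can be made concrete with simple trigonometry: if the AI-generated background implies a light source at a given elevation above the horizon, the shadow composited under the subject must match in length. The sketch below is purely illustrative (the function name, the point-light model, and the numbers are assumptions for this example, not part of Zhan's actual pipeline):

```python
import math

def shadow_length(subject_height_m: float, light_elevation_deg: float) -> float:
    """Length of the ground shadow a subject casts when lit by a point
    source at `light_elevation_deg` above the horizon."""
    if not 0.0 < light_elevation_deg < 90.0:
        raise ValueError("elevation must be strictly between 0 and 90 degrees")
    return subject_height_m / math.tan(math.radians(light_elevation_deg))

# A 1.8 m model under a background sun implied at 45 degrees should cast
# roughly a 1.8 m shadow; at 30 degrees, a noticeably longer one.
print(round(shadow_length(1.8, 45.0), 2))  # 1.8
print(round(shadow_length(1.8, 30.0), 2))  # 3.12
```

A retoucher checking a composite against this kind of estimate can quickly spot when a painted-in shadow contradicts the lighting the background implies.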
This methodology was also applied to work for iSLAND magazine, involving underwater environments. Traditional underwater photography requires specialized housing for cameras, safety divers, and significant insurance premiums for talent. By shooting in a dry studio and using AI to generate the water, jellyfish, and deep-sea light refraction, Zhan produced imagery that looked like a high-budget location shoot at a fraction of the cost and risk.

The Economic and Technical Limitations of Full AI Generation

While the hybrid model is thriving in the high-end editorial space, Zhan has also experimented with fully AI-generated visuals for smaller brands. In these instances, the "no-shoot" model becomes a reality. Using basic iPhone photos as a structural reference, AI can generate lifestyle imagery that satisfies the demands of social media marketing.

However, professional-grade analysis reveals that the technology is not yet a universal replacement. For established brands, "product accuracy" remains the primary hurdle. Zhan notes that while AI is excellent at atmosphere and lighting, it often struggles with the high-fidelity textures of specific fabrics, the exactness of stitching, and the way specific materials catch light.

Current Technical Gaps:

  • Fabric Integrity: AI often "hallucinates" textures, changing silk to synthetic-looking surfaces.
  • Brand Consistency: Maintaining the exact silhouette of a specific garment across multiple AI-generated frames remains difficult without extensive manual retouching.
  • The Cost of Precision: As the requirement for product accuracy increases, the time spent "prompting" and "fixing" AI images can eventually equal the time and cost of a traditional photo shoot.
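The break-even dynamic in the last point can be sketched with back-of-envelope arithmetic. The figures below are hypothetical placeholders to show the shape of the calculation, not rates from Zhan's business:

```python
import math

def breakeven_iterations(shoot_cost: float, cost_per_ai_iteration: float) -> int:
    """Number of prompt-and-fix cycles at which cumulative AI cost
    reaches the cost of a traditional shoot (hypothetical flat rates)."""
    if cost_per_ai_iteration <= 0:
        raise ValueError("iteration cost must be positive")
    return math.ceil(shoot_cost / cost_per_ai_iteration)

# If a traditional shoot costs 5000 and each AI revision cycle
# (prompting, generating, retouching) costs 250, AI stops being
# cheaper after 20 iterations.
print(breakeven_iterations(5000, 250))  # 20
```

For a demanding brand campaign that needs dozens of revision cycles to reach product accuracy, the apparent savings of the "no-shoot" model can evaporate quickly.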

Data from industry analysts at Gartner suggests that while 80% of creative tasks will be "augmented" by AI by 2026, the need for human oversight of "brand-critical" assets will remain paramount. Zhan’s assessment aligns with this: for low-budget social content, AI is ready; for high-integrity brand campaigns, the camera remains an essential anchor of truth.

Chronology of the AI Integration in Commercial Photography

  • 2018–2021: Traditional commercial dominance. Photographers use AI-powered retouching tools (like Adobe’s "Content-Aware Fill") but rely 100% on captured frames.
  • Late 2022: The release of Stable Diffusion and Midjourney V4. Early adopters like Zhan begin experimenting with AI for mood boards and conceptualization.
  • 2023: The "Hybrid Year." Professional studios begin integrating AI backgrounds into studio-shot campaigns to circumvent rising travel costs and environmental concerns.
  • 2024–Present: The "Direct-to-AI" movement. Smaller e-commerce brands begin requesting shoots without cameras, utilizing "Image-to-Image" AI workflows.

Broader Impact and the Future of the Medium

The integration of AI into commercial photography is forcing a redefinition of the "photographer" role. The position is evolving into that of a "Creative Technologist" or "Visual Director," where the skill lies not just in operating a shutter, but in managing a complex synthesis of captured and generated data.

There are also significant implications for the labor market within the industry. Traditional roles such as location scouts, set builders, and certain categories of assistants may see a decline in demand as environments become digital. Conversely, there is a rising demand for AI artists and "prompt engineers" who understand the nuances of photographic lighting and composition.

Zhan’s perspective on this transition is one of "clarification" rather than "obliteration." He argues that as AI takes over the "construction" of commercial imagery, the camera will return to its original purpose: an instrument for recording reality and capturing moments that cannot be manufactured. By stripping photography of its purely commercial utility—the need to build fake worlds for products—AI may actually liberate the medium to be more honest.

The trajectory is clear: the gap between AI-generated and camera-captured imagery is closing. For the commercial industry, the studio is no longer just a room with lights; it is a portal to any environment imaginable. As Zhan’s work demonstrates, the photographers who thrive in this new era will be those who can master the machine without losing the human eye for detail, light, and soul. Commercial photography is not dying; it is being rebuilt, layer by layer, into a more efficient and visually ambitious discipline.
