A growing body of research points to the beneficial effects of exposure to the natural world on health, including reduced stress and faster healing. Now, policymakers, employers, and healthcare providers are increasingly taking the human need for nature into account in how they plan and operate.