Thomas, Author at Enscape

Enscape for SketchUp released! (Thu, 20 Apr 2017) https://learn.enscape3d.com/blog/sketchup-version-released/

Enscape, the real-time rendering plugin you (hopefully) love, now works with SketchUp! If you already know Enscape for Revit, you’ll feel right at home. It has the same one-click functionality and updates your Enscape scene as you modify your project in SketchUp. You also get the same export options, such as 360° panoramas and the standalone .exe file. Virtual reality for both Oculus and HTC Vive is also available with Enscape for SketchUp, accessible with a single click.

Up to now, the SketchUp release has been developed separately from the Revit version. In the future, we want to bring the two to the same level and make the feature set identical. Right now, some features of Enscape for SketchUp (like the grass and the improved water) are even ahead of the Revit version, but the differences will soon disappear.

One Enscape license for all plugins

If you are already an Enscape user, you can use your license for SketchUp too; it covers all our current and future plugins. This means you don’t have to buy a separate license for SketchUp (or, vice versa, for Revit). This allows you to use Enscape throughout your whole workflow, across different construction and modeling tools.

How does it work?

Materials


We use a material keyword system to add detail to the limited SketchUp materials. In many cases, this works out of the box; for example, if your material is already named “water” in SketchUp, it will render as water. You can find a list of possible keywords in our Knowledgebase.
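Conceptually, the keyword system just scans material names for known hints. Here is a minimal sketch of that idea in Python; the keyword set and function name are hypothetical illustrations, not Enscape's actual implementation (the real keyword list is in the Knowledgebase):

```python
# Hypothetical keyword set; see the Enscape Knowledgebase for the real list.
MATERIAL_KEYWORDS = {"water", "grass", "glass"}

def detect_keyword(material_name):
    """Return a known keyword contained in a material's name,
    or None if the name carries no recognizable hint."""
    lowered = material_name.lower()
    for keyword in MATERIAL_KEYWORDS:
        if keyword in lowered:
            return keyword
    return None

print(detect_keyword("Pool Water Dark"))  # water
print(detect_keyword("Concrete"))         # None
```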

Camera synchronization


You can use the SketchUp navigation scheme by enabling the live camera synchronization. Just hit the camera icon with the blue arrow and Enscape will mimic your SketchUp viewport. This can be handy if you use an external monitor or projector for Enscape and want to edit or navigate from your laptop screen.

Section Planes

Enscape shows the Section Planes as drawn in SketchUp.

What’s next?

The next update will include support for proxy geometry and artificial lights. This will allow you to improve the detail of your scene even further.

Become a part of Enscape

We recently relaunched our forum – be our guest and take part in the discussions! Have a look at images that other architects and designers around the world share, and of course you can post your own to hear what others think about your work. There’s a place for questions and ideas; however, we’d like to encourage you to email our support about detailed technical problems. You can also use the feedback button in the Enscape toolbar, which sends us your Enscape error logs.

In addition to the forum, we have our public development agenda. We want to create a tool that is useful and inspiring for you, so we need to know what is important to you. Therefore, please vote for features that you want to have in Enscape and suggest ones that are missing.

Meet us!


The BILT conference in Singapore was a great success; it was nice to meet many Enscape users from the Asian region and learn about their architectural work. In the coming weeks, we will be in Orlando, Adelaide, and Toronto. If you’re near one of these cities, come on over and visit our booth!

BILT Asia 2017 in Singapore (Tue, 18 Apr 2017) https://learn.enscape3d.com/blog/bilt-asia-2017-singapore/


BILT ASIA 2017

This year, we are attending a variety of events, both as exhibitors and as visitors. This March, we traveled to Singapore to meet many Enscape users we hadn’t met before. It was a great opportunity to see what’s going on in different AEC areas and how people are progressing on their journey towards a BIM workflow. The BILT conference is structured around classes and workshops; in between, you also get the opportunity to meet other people over drinks and food. It’s a great combination of learning about new trends and techniques in the classes and connecting with people.

Take a look at our journey through BILT Asia 2017. See what people had to say about their own experiences, and learn how some of them use Enscape in their own projects. Conferences such as BILT bring the best of the industry together in one space. They give many attendees the opportunity to learn and bring home fresh ideas to improve their projects and their relationships with clients.

The next BILT conferences will be in Australia and in Denmark – hope to see you there 🙂

Version 1.9 Released (Thu, 09 Mar 2017) https://learn.enscape3d.com/blog/version-1-9/

Finally, Enscape 1.9 is here! This release took some time due to the number of things we wanted to include. We took a critical look at every aspect and asked ourselves how we could incorporate your feedback and ideas. Have a look at the video and read about the details below:

New Features and Improvements

Decals

We now support decals. This enables a new range of visual refinements and fills one of the last small gaps in fully leveraging your Revit model.

Improved glass

You can now add textures to your transparent materials. In addition, light passing through glass is now realistically refracted, allowing for a very accurate representation of glass elements in your scene. We also improved the support of material parameters for transparent surfaces: color, transparency, glossiness, and bump textures now translate into Enscape as well.

Polystyrol Mode

The Polystyrol Mode already existed in Enscape – its purpose is to let you view your model as a miniature, as if it were made out of cardboard. Even if you do not yet have a proper material setup, it will look interesting and appealing. With Enscape 1.9, we have gone a step further and added a more physically correct subsurface light scattering to simulate how light behaves in thin materials. You can adjust the transmission parameter depending on the scale your model is meant to represent: in terms of light scattering, the transmission in a 1:50 model is lower than in a 1:100 model.

Auto Contrast and Sharpening

 

[Before/after comparison: the same view before and after Auto Contrast]

Even the best renderings sometimes need refinement in image-editing software. We want to save you time, so we implemented a few common and handy adjustments directly in Enscape. The sharpening now sharpens your image without creating dark halos around objects. Auto Contrast expands the levels of your image histogram to give you the maximum range from black to white. This has nothing to do with exposure – although it can rescue a bad one. Usually, a correct exposure is halfway to a good contrast, but if the lighting in a particular view is monotone, Auto Contrast can help maximize the visual impact.
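At its core, Auto Contrast is a levels expansion. The sketch below shows the basic idea on a list of grayscale values; Enscape's actual implementation is of course more involved:

```python
def auto_contrast(pixels):
    """Stretch the levels so the darkest pixel maps to 0 and the
    brightest to 255, expanding the histogram to the full range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:  # flat image: nothing to stretch
        return list(pixels)
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]

# A low-contrast image occupying only the 60..180 range is
# expanded to the full black-to-white range.
print(auto_contrast([60, 100, 140, 180]))  # [0, 85, 170, 255]
```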

Oculus Touch

We have supported the HTC Vive controllers for virtual reality interaction since previous versions. With Enscape 1.9, we are also adding support for the Oculus Touch controllers; the capabilities of the two are the same. We often heard of confusion from first-time VR users while they were getting comfortable with the controls. Since this can be tedious during client presentations, we changed the controller layout to make it more intuitive. We also added 3D instructions that appear when you look directly at the controllers. This helps both frequent users and novices have a better VR experience.

Videorecorder

The Videorecorder can now interpolate between different depth-of-field ranges and allows different frames-per-second settings. Additionally, the bit rate under Compression Quality is now calculated automatically based on a desired quality preset, so you no longer have to guess it from your resolution.
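The idea behind deriving a bit rate automatically can be sketched as a bits-per-pixel budget scaled by resolution and frame rate. The preset values below are invented purely for illustration; Enscape's actual presets and formula are not published here:

```python
# Invented preset values for illustration only.
QUALITY_BITS_PER_PIXEL = {"low": 0.05, "medium": 0.1, "high": 0.2}

def auto_bitrate(width, height, fps, quality):
    """Derive a video bit rate (bits per second) from resolution and
    frame rate, so the user only has to pick a quality preset."""
    return int(width * height * fps * QUALITY_BITS_PER_PIXEL[quality])

print(auto_bitrate(1920, 1080, 30, "medium"))  # 6220800, about 6.2 Mbit/s
```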

Atmosphere

Enscape’s atmosphere and cloud system has always been nice to look at – but we refined it again to give your scenes even better surroundings. The system is now based on physical measurements of the real-world sky, so it fully corresponds to the appropriate lighting intensities. You also have a range of sliders to define the cloud density in your scene.

It’s faster!

We always dedicate a good portion of our efforts to making Enscape faster and more stable, while adding and improving features at the same time. I’m sure you’ll notice!

We’re coming to a place near you!

Over the next couple of months, we might get the chance to meet each other! Starting in March, we are at the CTC Midwest-U conference in Minneapolis, in April at BILT Asia in Singapore, and in May at BILT Australia in Adelaide. Just approach us at our booth and say hello!

Version 1.7 Released (Sun, 06 Nov 2016) https://learn.enscape3d.com/blog/enscape-version-1-7/


I’m excited to tell you that, after a couple of months, we have once again released a new version of Enscape, full of improvements and additions! The last months were unbelievable: we got the chance to meet many of you in person at RTC North America and RTC Europe. Thanks a lot for the valuable input you gave us; we are trying hard to meet your wishes and even surprise you with things you did not know were possible 🙂

Again, we fixed many of your reported issues. For example, missing lights or objects drawn only from the back side should be gone. We polished almost every existing feature, like the LightView – which is now more accurate – and the fog and various lens effects, which are a lot more realistic and visually appealing. But let me give you a more detailed overview of the biggest additions:

Videorecorder


Enscape now allows you to create MP4 video files within seconds. It works as follows:

  1. Fly to a start position & view angle
  2. Fly to a stop position & view angle
  3. Specify your path duration
  4. Save it as an MP4!

Easy as that. You can also change the time of day or the camera zoom (field of view) between the two key frames; during the video, the camera is blended between them. You can also specify how the camera should move from start to stop: for example, you can give it a smooth acceleration at the beginning, or you can use the handycam mode to add slight shakes, as if the video were recorded with a camcorder carried by a human.
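Blending between the two key frames boils down to easing an interpolation parameter over the path duration. Here is a sketch using the classic smoothstep curve for the "smooth acceleration" case; this is an illustration of the technique, not Enscape's internal code:

```python
def smoothstep(t):
    """Ease-in/ease-out curve: zero velocity at t = 0 and t = 1,
    giving the camera a smooth acceleration and deceleration."""
    return t * t * (3 - 2 * t)

def blend_camera(start, stop, t, easing=smoothstep):
    """Interpolate each camera parameter (position, field of view,
    time of day, ...) between the key frames at progress t in [0, 1]."""
    e = easing(t)
    return tuple(a + (b - a) * e for a, b in zip(start, stop))

start = (0.0, 0.0, 1.7)   # x, y, z of the start key frame
stop = (10.0, 4.0, 1.7)   # x, y, z of the stop key frame
print(blend_camera(start, stop, 0.5))  # (5.0, 2.0, 1.7), the halfway point
```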


You can press the preview button to get a real-time preview of the video without any waiting. Exporting the video can take from a few seconds to a few minutes, depending on the quality you choose. The quality slider affects:

  • Anti-aliasing quality, to hide jagged edges and flickering
  • Shadow quality
  • Motion blur and depth-of-field quality

For most cases, the medium preset should be totally fine.

ArchVision RPC

In previous versions, we offered the option to replace the RPC entourage in your Revit project with high-quality 3D models that we ship with Enscape. This is still possible, but of course we could not offer unique replacement models for the many thousands of RPC objects ArchVision offers.

In Enscape 1.7, this has changed! You can now use all the entourage in your ArchVision Dashboard within Enscape. That means you can show all the vegetation in its correct appearance. Some of the 2D entourage objects look a little flat, although they are rotating billboards. ArchVision therefore has a growing number of true 3D models (called 3D+) in their library, which render as true 3D meshes inside Enscape. Give it a try – it adds extra realism and variation to your project, without the need for a special project file besides your Revit project. You can also create your own RPCs using ArchVision’s RPC Creator.

Layered Screenshot export

So far, you could already save beautifully rendered images out of Enscape within seconds. However, you often might want to add one or two things in Photoshop afterwards. No problem! In version 1.7, you can export an alpha mask together with your image. That means you can separate the foreground from the background in post-production tools like Photoshop and edit it very easily.
[Before/after comparison: the rendered image and its exported alpha mask]

Double click to fly

In Enscape, we offer multiple ways of navigating in your project:

  • Keyboard and mouse
  • SpaceMouse
  • Xbox controller
  • Choosing a Revit view (which then sets the camera for you)
  • Using Enscape’s Live Camera and moving the camera object in your Revit plan

We added a simple but useful tool for mouse navigation: if you want to fly somewhere, just double-click there, and you’ll be transitioned there smoothly. This allows for quick view changes during presentations and is even more useful if you hand the navigation over to someone who has never used Enscape or a 3D tool before.

Faster loading and rendering

With Enscape, it is all about saving time and focusing on the things that matter: your architecture. That is why we constantly work on speeding up loading times and lowering the system requirements. Especially for large projects with many families, Enscape should now load in a fraction of the time previously needed. This allows for spontaneous launches of Enscape to get a quick impression or show someone what you are working on.

The frame rate has also increased: Enscape runs much smoother now, especially at high resolutions. A common misconception is that faster rendering has to look worse; we always try to achieve the opposite. It is now so fast that it even runs with global illumination enabled on the Oculus Rift (at Ultra quality level). The new version runs a lot faster and looks crisper, with less aliasing and flickering, more correct lighting simulation, sharper shadows, an increased view distance, and a lot more. I hope you will like it – be sure to have a look!


What’s coming next?

First of all, we hope to meet you at Autodesk University 2016 in Las Vegas! After that, it won’t take too long until the next update – please check our Development Agenda, where you can always get an impression of what we’re working on. Send us a mail or write a comment if you want to suggest something!

It has been a great time so far, and I hope you enjoy Enscape 1.7! Have fun and make the best out of your projects – let them shine in a great light 🙂

Architectural Rendering Glossary (Tue, 23 Aug 2016) https://learn.enscape3d.com/blog/architecural-rendering-glossary/

The field of computer graphics is developing very fast, and its terminology is becoming more and more complex, leading to misunderstandings in technical discussions among architects and other rendering enthusiasts. We want to explain the most important recent concepts so that you can get the most out of your architectural designs.

General Explanations

Rendering

The term “rendering” describes an artificially generated image, as opposed to a photograph. It does not describe the process behind the image, or its quality. The common question “Is this image truly rendered?”, asked to gauge the quality of a rendering, hence does not make sense. If someone describes an image as rendered, the only information conveyed is that the image is not a real photo.

Resolution

To store images on a computer, we have to tile them into little pixels. The resolution only describes the number of pixels in the image – not the quality, sharpness, or anything else. Even a high-resolution rendering can still look jagged and unrealistic.

Real time vs. Offline

Generating an image can take up to days (offline) or as little as milliseconds (real time). We call a process real time if it enables you, as a user, to interact with it without waiting for the result. This is usually the case if the image appears in less than 50 ms, which equals more than 20 frames per second.
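The 50 ms threshold and the 20 frames-per-second figure are two views of the same number; a frame time converts to a frame rate as follows:

```python
def frames_per_second(frame_ms):
    """Convert a per-frame time in milliseconds to a frame rate."""
    return 1000 / frame_ms

def is_real_time(frame_ms, threshold_ms=50):
    """Interactive ("real time") if each frame arrives within 50 ms."""
    return frame_ms <= threshold_ms

print(frames_per_second(50))          # 20.0 fps, the real-time threshold
print(is_real_time(16.7))             # True: about 60 fps
print(is_real_time(2 * 3600 * 1000))  # False: a two-hour offline render
```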

The time needed for a real-time rendering and an offline rendering differs by orders of magnitude; the quality of the resulting image does not have to. The techniques used to create those images are fundamentally different – an offline image that takes hours is not just a real-time image given a lot more time. Generally, you can achieve the highest quality by using offline rendering and investing some time. However, the quality gap between the fast techniques applied in real-time rendering and old-school offline rendering is getting smaller, thanks to advances in software research and faster end-user hardware in your computer.

Imagine it as the difference between early digital photography and classic film: the underlying process is different, yet it achieves the same effect within a different framework. As digital photography advanced, the benefit of shooting on true film almost vanished.

Ray-tracing

To create a rendering, you can assume that light consists of little particles moving along rays. To simulate this, we use ray tracing. Ray tracing is commonly associated with offline rendering because it is computationally complex. However, modern real-time renderers (e.g. Enscape) already use ray tracing for some of their calculations.

Technique Descriptions

(Anti) Aliasing

We want to mimic images from a camera: a photograph. Even a digital photograph consists of pixels. Every pixel of a real photograph describes the average light that hits the sensor over the whole area of that pixel (really small – but still an area, not an infinitely small point!). Now, if a pixel covers an area where – let’s say – a black wall stands in front of a white background, the pixel will be neither purely black nor white: it will be gray. This leads to a naturally smooth image.

An aliased border, without proper anti-aliasing

An anti-aliased border, as photographed with a real camera

In computer graphics, we are sometimes tempted to treat a pixel like a single, very small point. This makes our gray wall-transition pixel either black or white. We do so because it makes many computations faster, but it creates an unpleasant effect: aliasing.

To counteract this, a variety of techniques are available. Some of them, like FXAA, employ a smart blur filter. Unfortunately, blurring the image does not solve flickering, even if every single frame is “blurred” correctly. The only reliable solution is to render the image at a higher resolution (still aliased) and then scale it down. We call this super sampling (the way to go if you have a lot of time) or, in a variant, multi sampling.
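Super sampling itself is conceptually simple: render larger, then average blocks of pixels down. A toy sketch on a black/white edge rendered at twice the target resolution:

```python
def downsample(image, factor):
    """Average each factor x factor block of the high-resolution image
    into one output pixel; hard edges become smooth gray transitions."""
    out = []
    for y in range(0, len(image), factor):
        row = []
        for x in range(0, len(image[0]), factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black/white edge rendered at twice the target resolution...
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
# ...averages into a gray transition pixel when scaled down.
print(downsample(hi_res, 2))  # [[0.0, 255.0], [127.5, 255.0]]
```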

Solution for real time graphics

In current real-time graphics, the computational demand of a super-sampled image is divided over time. The software reuses the previous frame while adding new information about the edges frame after frame. This is called temporal anti aliasing, and it can make your image a bit blurry or noisy in motion. With a stationary camera, however, the image becomes very sharp and crisp within a fraction of a second.

Physically based rendering

While developing software that creates a rendering, you have to tell the computer what it should do – in every detail. This includes how light behaves (without light, we would only have a black screen). You can quickly come up with a few basic observations: a light particle is reflected or absorbed when it hits something; it then bounces through the scene, creating shadows and colorful imagery. Until a few years ago, it was still common to rely on simple assumptions: one part of the light spreads evenly (the diffuse part) and the other forms a glossy or rough highlight (the specular part).

These assumptions are not unreasonable; however, they are not exactly how nature behaves. The term physically based rendering describes the effort of replicating real-world material interaction with light as closely as possible. It turns out that the way a light particle bounces off a surface is strongly angle-dependent. This knowledge is based on special scans, where scientists measured the behavior of light on materials under controlled lighting conditions. Using those exact measurements, we can now design more complex simulations that match the light properties of real-world materials without too many simplifications.
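A well-known example of such an angle-dependent model is Schlick's approximation of the Fresnel term, widely used in physically based renderers (whether Enscape uses exactly this formula is not stated here):

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation of the Fresnel term: reflectance grows
    from the base value f0 toward 1.0 at grazing viewing angles."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Looking straight at a glass pane (cos = 1): only ~4% reflects.
print(schlick_fresnel(1.0, 0.04))  # 0.04
# At a grazing angle (cos near 0) it acts almost like a mirror.
print(round(schlick_fresnel(0.0, 0.04), 6))  # 1.0
```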

The visual impact of using physically based materials can be subtle – but it leads to an overall coherent and consistent look.

Unbiased rendering

Given your set of materials and lights and your whole scene, mathematics tells us exactly how your image should look. Just as in nature, there is only one kind of visible light, and its behavior is mathematically well described.

If you want to create a truly correct simulation of the light in your image, it can take a very long time to compute. By obeying the mathematical rules of light, you are creating an unbiased rendering. However, if you allow an almost invisible difference in image quality (the bias), the computational process can be a lot faster. The challenge for us developers is to keep the visual bias very low while maximizing the speed we gain from accepting it.
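Unbiased renderers such as path tracers rely on Monte Carlo estimation: the result is noisy, but its expected value is exactly the true answer, and the noise shrinks as you invest more samples. The classic pi-by-random-darts example shows the same trade-off between computation time and residual noise:

```python
import random

def estimate_pi(samples, seed=0):
    """Unbiased Monte Carlo estimate of pi: count random points that
    fall inside the unit quarter circle. More samples, less noise."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / samples

# Noisy with few samples, converging toward pi with many.
print(estimate_pi(100))
print(estimate_pi(100_000))
```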

Note that you can create an unbiased rendered image, one totally aligned with the laws of light, without using physically based materials.

Global Illumination

When light hits a surface, it does not end there. In indoor environments, for example, there are areas in your building that are not illuminated by artificial light and have no direct view of the bright sky or the sun. However, they are not completely dark: they receive the light bounced off other surfaces.

Left: Global Illumination, Right: Only direct light. Compare the light on the ceiling and the reflections on the ground.

These bounces mean the lighting is no longer local – it is global. The classic look of early 3D computer games comes from using a constant ambient brightness instead of a global illumination that depends on the scene geometry. Enscape, for example, calculates multiple light bounces in real time to simulate indirect light even in indoor environments.
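The effect of multiple bounces can be illustrated with a toy calculation: a patch with no direct light still accumulates brightness from light reflected by its neighbors. This is only a didactic sketch, not how a real renderer propagates light:

```python
def bounce_light(direct, albedo, bounces):
    """Toy global illumination: in each pass, every patch receives the
    average of what all patches reflected in the previous pass."""
    total = list(direct)
    reflected = list(direct)
    for _ in range(bounces):
        incoming = sum(reflected) / len(reflected)
        reflected = [albedo * incoming] * len(direct)
        total = [t + r for t, r in zip(total, reflected)]
    return total

# Patch 0 is sunlit; patch 1 (say, a ceiling) gets no direct light at
# all, yet ends up visibly lit purely through the bounces.
print(bounce_light([1.0, 0.0], albedo=0.5, bounces=3))  # [1.4375, 0.4375]
```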

Auto Exposure

A camera can capture only a limited range of brightness values. For example, if you adjust your camera so that you can see all the details in a dark room, you would see only a plain white image if you then pointed the camera at broad daylight. The reason is that the potential brightness range in real scenes is much larger than what our eyes or a camera can capture. That is why your eyes have a flexible pupil diameter and a camera has a shutter. In Enscape, we replicate this to make the image look like a photograph.

Finding the right exposure is sometimes not trivial. You can of course set the exposure manually (similar to what happens when you use the camera app on your smartphone and tap somewhere on the screen), but that would require manual readjustment whenever you move from the inside of a building to the sunny outside. That is why there is something called auto exposure: the exposure is calculated automatically to ensure a correct exposure for most of the screen. If most of the screen is very bright due to the sun, darker parts of the screen, including weak artificial lights, may seem to disappear because they are underexposed. Keep in mind that real sunlight is magnitudes brighter than most artificial indoor light.

Left: Sunny day, exposure set to correctly capture the outside. Inside looks too dark. Center: Sunny day, exposure set to correctly capture the inside, the outside looks too bright. Right: Clouds adjusted to simulate a cloudy day, now the outside is not as bright anymore. The camera is now able to capture the indoor lighting and outdoor lighting situation without over brightening or darkening.
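A crude auto-exposure meter can be sketched as scaling the image so its average luminance lands on mid-gray. The target value and scene numbers below are illustrative only, not how any particular renderer meters exposure:

```python
def auto_exposure(luminances, target=0.5):
    """Scale all luminance values so the frame's average hits a
    mid-gray target, roughly what an auto-exposure meter does."""
    gain = target / (sum(luminances) / len(luminances))
    return [min(1.0, lum * gain) for lum in luminances]

# A sunny view with one weak artificial light (0.02): the bright sky
# dominates the average, so the dim light stays nearly invisible.
exposed = auto_exposure([0.9, 0.8, 0.9, 0.02])
print([round(v, 3) for v in exposed])  # [0.687, 0.611, 0.687, 0.015]
```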

Ambient Occlusion

Recall the paragraph above about unbiased rendering: sometimes you have to accept some limitations that make the result less physically correct, but faster. That is why graphics developers invented ambient occlusion.

Due to the complex phenomenon of light bouncing between different surfaces, occluded corners often become a bit darker. This is frequently the case, but if you look at the corners of the room you are sitting in, you might notice that sometimes there is no darkening at the edges at all.

Ambient occlusion is nevertheless a common technique to imitate a correct global illumination calculation: the rendering software darkens areas where something might be in the way of incoming light. This is not very close to “real” and does not represent the natural properties of light, yet it makes it easy for the viewer to grasp the geometric appearance of the scene. Then again, it sometimes looks like dirt in the corners or even creates images that appear too dark. Screen Space Ambient Occlusion (SSAO) is an even rougher approximation of normal ambient occlusion: it only considers occlusion from objects currently visible on your screen.

Left: Ambient Occlusion, Right: Global Illumination. Note the varying occlusion radius with regard to the scene lighting environment
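The core of ambient occlusion is just a ratio: sample directions around a point and darken by the fraction that hit nearby geometry. A minimal sketch of that darkening factor (the sample counts are illustrative):

```python
def ao_brightness(blocked_samples, total_samples):
    """Ambient occlusion factor: a brightness multiplier of 1.0 means
    the point is fully open; values near 0.0 mean it sits in a cavity."""
    return 1.0 - blocked_samples / total_samples

# An open floor stays bright; a tight corner with 12 of 16 sample
# directions blocked is rendered noticeably darker.
print(ao_brightness(0, 16))   # 1.0
print(ao_brightness(12, 16))  # 0.25
```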

An unbiased rendering software does not use ambient occlusion, and neither does Enscape (unless you disable global illumination). We obtain the effect of naturally darkened corners through the multiple light bounces.

Conclusion

Hopefully this article contributes to a better understanding of a few important graphics buzzwords. If you do not know Enscape already, try it now! Enscape is our Revit plugin for real-time renderings at a very high quality level. You simply walk through your architectural project based on the CAD planning data – no export, import, or tuning needed! If you liked this post or have any comments, please send us a mail. Thank you!
