Up to this point we've been creating our models manually. While this is an acceptable way to do things, it gets really slow when we want to work with complex models that have lots of polygons. Because of this, we're going to modify our code to leverage the obj model format so that we can create a model in software such as Blender and display it in our code.
Our `main.rs` file is getting pretty cluttered, so let's create a `model.rs` file to put our model loading code into.
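As usual when adding a file, we also need to declare the module in `main.rs`. A minimal sketch (the trait import refers to the `Vertex` trait we're about to write):

```rust
// main.rs
mod model;
use model::Vertex; // the trait we define next; needed so we can call ModelVertex::desc()
```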
You'll notice a couple of things here. In `main.rs` we had `Vertex` as a struct; here we're using a trait. We could have multiple vertex types (model, UI, instance data, etc.). Making `Vertex` a trait will allow us to abstract out the `VertexBufferDescriptor` creation code to make creating `RenderPipeline`s simpler.
Another thing to mention is the `normal` field in `ModelVertex`. We won't use this until we talk about lighting, but we'll add it to the struct for now.
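Here's a sketch of how `model.rs` might start. The field layout, the attribute formats, and the `desc()` name are assumptions that match the vertex data we've been using in the previous tutorials, so adjust them if your shaders differ.

```rust
// model.rs
pub trait Vertex {
    fn desc<'a>() -> wgpu::VertexBufferDescriptor<'a>;
}

#[repr(C)]
#[derive(Copy, Clone, Debug)]
pub struct ModelVertex {
    position: [f32; 3],
    tex_coords: [f32; 2],
    normal: [f32; 3],
}

impl Vertex for ModelVertex {
    fn desc<'a>() -> wgpu::VertexBufferDescriptor<'a> {
        use std::mem;
        wgpu::VertexBufferDescriptor {
            stride: mem::size_of::<ModelVertex>() as wgpu::BufferAddress,
            step_mode: wgpu::InputStepMode::Vertex,
            attributes: &[
                // position
                wgpu::VertexAttributeDescriptor {
                    offset: 0,
                    shader_location: 0,
                    format: wgpu::VertexFormat::Float3,
                },
                // tex_coords
                wgpu::VertexAttributeDescriptor {
                    offset: mem::size_of::<[f32; 3]>() as wgpu::BufferAddress,
                    shader_location: 1,
                    format: wgpu::VertexFormat::Float2,
                },
                // normal
                wgpu::VertexAttributeDescriptor {
                    offset: mem::size_of::<[f32; 5]>() as wgpu::BufferAddress,
                    shader_location: 2,
                    format: wgpu::VertexFormat::Float3,
                },
            ],
        }
    }
}
```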
This is basically the same as the original `VertexBufferDescriptor`, but we added a `VertexAttributeDescriptor` for the `normal`. Remove the `Vertex` struct in `main.rs` as we won't need it anymore, and use our new `Vertex` from `model` when creating the `RenderPipeline`.
```rust
let render_pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
    // ...
    vertex_buffers: &[model::ModelVertex::desc()],
    // ...
});
```
With all that in place we need a model to render. If you have one already that's great, but I've supplied a [zip file](https://github.com/sotrh/learn-wgpu/tree/master/code/beginner/tutorial9-model/src/res) with the model and all of its textures. We're going to put this model in a new `res` folder.
Speaking of textures, let's add a `load()` method to `Texture` in `texture.rs`.
The `load` method will be useful when we load the textures for our models, as `include_bytes!` requires that we know the name of the file at compile time which we can't really guarantee with model textures.
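Here's a sketch of what `load` could look like. It assumes a `from_image` helper on `Texture` (like the `from_bytes` method from the textures tutorial, but taking an already-decoded image) that returns the texture along with the `wgpu::CommandBuffer` used to upload it.

```rust
// texture.rs
use std::path::Path;

impl Texture {
    pub fn load<P: AsRef<Path>>(
        device: &wgpu::Device,
        path: P,
    ) -> Result<(Self, wgpu::CommandBuffer), Box<dyn std::error::Error>> {
        // image::open reads the file at runtime, so unlike include_bytes! the
        // path doesn't need to be known at compile time.
        let img = image::open(path)?;
        let (texture, cmd_buffer) = Self::from_image(device, &img)?;
        Ok((texture, cmd_buffer))
    }
}
```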
We're going to use the [tobj](https://docs.rs/tobj/0.1.12/tobj/) library to load our model. Before we can load our model though, we need somewhere to put it.
```rust
// model.rs
pub struct Model {
    pub meshes: Vec<Mesh>,
    pub materials: Vec<Material>,
}
```
You'll notice that our `Model` struct has a `Vec` for `meshes` and one for `materials`. This is important, as an obj file can include multiple meshes and materials. We still need to create the `Mesh` and `Material` structs, so let's do that.
```rust
pub struct Material {
    pub name: String,
    pub diffuse_texture: texture::Texture,
}

pub struct Mesh {
    pub name: String,
    pub vertex_buffer: wgpu::Buffer,
    pub index_buffer: wgpu::Buffer,
    pub num_elements: u32,
    pub material: Option<usize>,
}
```
The `Material` is pretty simple: it's just a name and one texture. Our cube obj actually has 2 textures, but one is a normal map, and we'll get to those [later](./intermediate/normal-mapping). The name is mostly for debugging purposes.
`Mesh` holds a vertex buffer, an index buffer, and the number of indices in the mesh. We're using an `Option<usize>` for the material, as not all meshes are guaranteed to have a material. This `usize` will be used to index the `materials` list when it comes time to draw.
With all that out of the way, we can get to loading our model.
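Here's a sketch of a `load` method. The tobj call uses the single-path `load_obj` from the version linked above, and the buffer creation uses `create_buffer_mapped`/`fill_from_slice` from the wgpu version we've been on so far; adapt those calls if your versions differ.

```rust
// model.rs
use std::path::Path;

impl Model {
    pub fn load<P: AsRef<Path>>(
        device: &wgpu::Device,
        path: P,
    ) -> Result<(Self, Vec<wgpu::CommandBuffer>), Box<dyn std::error::Error>> {
        let (obj_models, obj_materials) =
            tobj::load_obj(path.as_ref()).map_err(|e| format!("{:?}", e))?;

        // The obj references its textures relative to the obj file itself.
        let containing_folder = path.as_ref().parent().unwrap();

        let mut command_buffers = Vec::new();
        let mut materials = Vec::new();
        for mat in obj_materials {
            let diffuse_path = containing_folder.join(&mat.diffuse_texture);
            let (diffuse_texture, cmds) = texture::Texture::load(device, diffuse_path)?;
            command_buffers.push(cmds);
            materials.push(Material {
                name: mat.name,
                diffuse_texture,
            });
        }

        let mut meshes = Vec::new();
        for m in obj_models {
            // tobj gives us flat f32 arrays; regroup them into our ModelVertex.
            let mut vertices = Vec::new();
            for i in 0..m.mesh.positions.len() / 3 {
                vertices.push(ModelVertex {
                    position: [
                        m.mesh.positions[i * 3],
                        m.mesh.positions[i * 3 + 1],
                        m.mesh.positions[i * 3 + 2],
                    ],
                    tex_coords: [m.mesh.texcoords[i * 2], m.mesh.texcoords[i * 2 + 1]],
                    normal: [
                        m.mesh.normals[i * 3],
                        m.mesh.normals[i * 3 + 1],
                        m.mesh.normals[i * 3 + 2],
                    ],
                });
            }

            let vertex_buffer = device
                .create_buffer_mapped(vertices.len(), wgpu::BufferUsage::VERTEX)
                .fill_from_slice(&vertices);
            let index_buffer = device
                .create_buffer_mapped(m.mesh.indices.len(), wgpu::BufferUsage::INDEX)
                .fill_from_slice(&m.mesh.indices);

            meshes.push(Mesh {
                name: m.name,
                vertex_buffer,
                index_buffer,
                num_elements: m.mesh.indices.len() as u32,
                material: m.mesh.material_id,
            });
        }

        Ok((Self { meshes, materials }, command_buffers))
    }
}
```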
Make sure that you change the `IndexFormat` that the `RenderPipeline` uses from `Uint16` to `Uint32`. Tobj stores indices as `u32`s, so using a smaller index format will result in your model getting mangled.
```rust
let render_pipeline = device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
    // ...
    index_format: wgpu::IndexFormat::Uint32,
    // ...
});
```
## Rendering a mesh
Before we can draw the model, we need to be able to draw an individual mesh. Let's create a trait called `DrawModel`, and implement it for `RenderPass`.
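Here's a sketch of what that trait could look like. The buffer-binding calls (`set_vertex_buffers`, `set_index_buffer`) are from the wgpu version we've been using in the earlier tutorials, so swap in the calls from your own render code if they differ.

```rust
// model.rs
use std::ops::Range;

pub trait DrawModel {
    fn draw_mesh(&mut self, mesh: &Mesh);
    fn draw_mesh_instanced(&mut self, mesh: &Mesh, instances: Range<u32>);
}

impl<'a> DrawModel for wgpu::RenderPass<'a> {
    fn draw_mesh(&mut self, mesh: &Mesh) {
        // Drawing a single mesh is just an instanced draw with one instance.
        self.draw_mesh_instanced(mesh, 0..1);
    }

    fn draw_mesh_instanced(&mut self, mesh: &Mesh, instances: Range<u32>) {
        // Bind the mesh's buffers, then issue the indexed draw.
        self.set_vertex_buffers(0, &[(&mesh.vertex_buffer, 0)]);
        self.set_index_buffer(&mesh.index_buffer, 0);
        self.draw_indexed(0..mesh.num_elements, 0, instances);
    }
}
```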
We could have put these methods in `impl Model`, but I felt it made more sense to have the `RenderPass` do all the rendering, as that's kind of its job. This does mean we have to import `DrawModel` when we go to render though.
```rust
// main.rs
use model::DrawModel; // bring the trait into scope

// in the render method:
render_pass.draw_mesh_instanced(&self.obj_model.meshes[0], 0..self.instances.len() as u32);
```
Before that, though, we need to actually load the model and save it to `State`. Put the following in `State::new()`.
```rust
let (obj_model, cmds) = model::Model::load(&device, "code/beginner/tutorial9-models/src/res/cube.obj").unwrap();
queue.submit(&cmds);
```
The path to the obj will be different for you, so keep that in mind.
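You'll also want a field on `State` to hold the model; the name below just matches the `self.obj_model` we used when drawing.

```rust
// main.rs
struct State {
    // ...
    obj_model: model::Model,
    // ...
}
```

Remember to include `obj_model` when you build the `State` value at the end of `new()`.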
With all that done, you should get something like this.
![cubes.png](./cubes.png)
## Using the correct textures
If you look at the texture files for our obj, you'll see that they don't match what's being rendered. The texture we want to see is this one,
![cube-diffuse.jpg](./cube-diffuse.jpg)
but we're still getting our happy tree texture.
The reason for this is quite simple. Though we've created our textures, we haven't created a bind group to give to the `RenderPass`. We're still using our old `diffuse_bind_group`.